avaly/backup-to-cloud

backup-to-cloud

A simple backup tool that encrypts files locally and uploads them to S3 in batches.

Ideally, it should be set up to run from a crontab entry.

Features

  • Encrypts files locally with gpg
  • Uploads files to S3 in batches of customizable size
  • Uploads a tar archive of the files in selected folders, which is useful for sources containing thousands of files (e.g. a photo library)
  • Rescans sources at specific intervals to find new or updated files
  • Removes files from S3 if they are removed locally
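The tar-archive feature for file-heavy sources can be pictured with plain tar (a hedged sketch; the paths and archive layout below are made up for illustration and are not the tool's actual behavior):

```shell
# Illustrative only: pack a file-heavy folder into a single archive so it
# can be encrypted and uploaded as one object instead of thousands.
mkdir -p /tmp/btc-demo/photos
touch /tmp/btc-demo/photos/a.jpg /tmp/btc-demo/photos/b.jpg
tar -cf /tmp/btc-demo/photos.tar -C /tmp/btc-demo photos
# List the archive contents to confirm what would be uploaded.
tar -tf /tmp/btc-demo/photos.tar
```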

Requirements

  • OS: Linux, macOS (untested)
  • Node.js v22+
  • awscli 1.8.6+ (for support of STANDARD_IA storage class)
  • find
  • gpg
  • tar

Install

  • Configure your AWS credentials: aws configure

  • Install dependencies: npm ci

  • Create a default config file: bin/backup-to-cloud init

  • Modify your new config file

  • Check your config file: bin/backup-to-cloud check-config

  • Try a scan first with: bin/backup-to-cloud scan --dry

  • Try a backup with: bin/backup-to-cloud backup --dry

  • Set up some crontab entries for it, for example:

    • run every hour with verbose logging:
    0 * * * * cd /path/to/this && ./bin/backup-to-cloud backup --verbose >> cron-backup.log 2>&1
    
    • run backup every 12 hours:
    0 */12 * * * cd /path/to/this && ./bin/backup-to-cloud backup >> cron-backup.log 2>&1
    
    • run scan every day:
    0 7 * * * cd /path/to/this && ./bin/backup-to-cloud scan >> cron-scan.log 2>&1
    

Commands

init

Create a default config file from the sample:

./bin/backup-to-cloud init

backup

Encrypt and upload pending files to S3 in batches:

./bin/backup-to-cloud backup --help
./bin/backup-to-cloud backup --dry
./bin/backup-to-cloud backup

scan

Scan sources and update the local DB without uploading files:

./bin/backup-to-cloud scan --help
./bin/backup-to-cloud scan --dry
./bin/backup-to-cloud scan

check-config

Validate your config file and required binaries:

./bin/backup-to-cloud check-config

restore

Restore a file or folder from S3 and decrypt it:

./bin/backup-to-cloud restore --help
./bin/backup-to-cloud restore --output OUTPUT_DIR_OR_FILE REMOTE_DIR_OR_FILE

Schedule a restore test:

0 1 * * * cd /path/to/this && ./bin/backup-to-cloud restore --max-size 1000000 --output TEMPORARY_DIR --test 0 / >> restore-test.log 2>&1

decrypt

Decrypt a downloaded encrypted file:

./bin/backup-to-cloud decrypt --help
./bin/backup-to-cloud decrypt --output OUTPUT_FILE INPUT_FILE
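Since the tool encrypts with gpg, a downloaded file is assumed to be standard gpg ciphertext that plain gpg can also decrypt. The round trip below is a hedged illustration using a symmetric passphrase; real backups would use your configured key instead:

```shell
# Create a sample file and encrypt it the way gpg would (symmetric mode
# here purely for the demo; the tool's key setup is not shown).
printf 'backup payload' > /tmp/btc-plain.txt
gpg --batch --yes --pinentry-mode loopback --passphrase demo \
    --symmetric --cipher-algo AES256 \
    --output /tmp/btc-plain.txt.gpg /tmp/btc-plain.txt
# Decrypt it back with plain gpg, no wrapper needed.
gpg --batch --yes --pinentry-mode loopback --passphrase demo \
    --output /tmp/btc-restored.txt --decrypt /tmp/btc-plain.txt.gpg
# Confirm the round trip preserved the contents.
cmp /tmp/btc-plain.txt /tmp/btc-restored.txt
```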

verify

Verify that the DB and remote files are in sync:

./bin/backup-to-cloud verify --help
./bin/backup-to-cloud verify --dry
./bin/backup-to-cloud verify
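Conceptually, a sync check of this kind boils down to comparing the set of keys recorded in the local DB with the set of keys actually on S3. A hedged sketch with made-up key lists (the tool's real mechanism is not shown here):

```shell
# Toy stand-ins for the local DB's keys and an S3 listing.
printf 'a.gpg\nb.gpg\n' | sort > /tmp/db-keys.txt
printf 'a.gpg\nc.gpg\n' | sort > /tmp/s3-keys.txt
# comm -3 prints lines unique to either file, i.e. out-of-sync entries:
# b.gpg exists only locally, c.gpg exists only remotely.
comm -3 /tmp/db-keys.txt /tmp/s3-keys.txt
```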
