A Python CLI tool for backup and restore across MySQL, PostgreSQL, MongoDB, and SQLite, with optional compression and AWS S3 upload/download.
Features:

- Backup and restore for 4 database types
- Optional `.tar.gz` compression
- AWS S3 upload for backups
- Restore directly from S3:
  - latest object in bucket
  - latest object under a prefix
  - exact object key
- Env-only configuration (`.env`)
- Docker + Docker Compose support
- Structured logging via `--log-file`
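The three S3 restore modes reduce to one selection rule over S3 listing results: filter by an optional prefix, then take the most recently modified key (an exact `--s3-key` skips listing entirely). A minimal sketch, assuming boto3's `list_objects_v2` response shape; `pick_latest_key` is an illustrative helper, not the tool's actual code:

```python
def pick_latest_key(contents, prefix=""):
    """Pick the most recently modified object key, optionally under a prefix.

    `contents` is the `Contents` list from a boto3 `list_objects_v2`
    response: dicts with at least `Key` and `LastModified`.
    """
    # Keep only objects whose key starts with the prefix ("" matches all)
    candidates = [obj for obj in contents if obj["Key"].startswith(prefix)]
    if not candidates:
        raise ValueError(f"no objects found under prefix {prefix!r}")
    # Newest object wins
    return max(candidates, key=lambda obj: obj["LastModified"])["Key"]
```

With an empty prefix this yields the newest object in the whole bucket; with a prefix, the newest object under it.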
How it works:

- Parse command arguments (`backup` or `restore`).
- Load env values from `.env` (or `--env-file`).
- Validate DB config and optional S3 config.
- Backup flow:
  - Connect to the database
  - Run the DB-specific dump
  - Optionally compress
  - Optionally upload to S3
- Restore flow:
  - Use a local backup file or download from S3
  - Decompress if needed
  - Run the DB-specific restore
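The "optionally compress" step in the backup flow can be sketched with the standard library's `tarfile`; `compress_backup` is an illustrative helper (the tool's internal API may differ), and it works for both single dump files (MySQL, SQLite) and dump directories (mongodump):

```python
import tarfile
from pathlib import Path

def compress_backup(dump_path: str) -> str:
    """Wrap a dump file or directory in a .tar.gz archive next to it."""
    src = Path(dump_path)
    archive = Path(str(src) + ".tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        # arcname keeps paths relative, so restore can extract anywhere
        tar.add(src, arcname=src.name)
    return str(archive)
```

The restore flow mirrors this: if the backup file ends in `.tar.gz`, extract it before running the DB-specific restore.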
Environment variables:

| Variable | Required | Description |
|---|---|---|
| `DB_HOST` | Yes | Database host |
| `DB_PORT` | Yes | Database port |
| `DB_NAME` | Yes | Database name (or SQLite file path) |
| `DB_USER` | Optional* | Database username |
| `DB_PASSWORD` | Optional* | Database password |
| `DB_AUTH_DATABASE` | Optional | Mongo auth source DB (defaults to `admin` if auth is used) |
| `AWS_ACCESS_KEY` | Required for S3 | AWS access key |
| `AWS_SECRET_KEY` | Required for S3 | AWS secret key |
| `AWS_REGION` | Required for S3 | AWS region |

\* For MongoDB, set `DB_USER` and `DB_PASSWORD` together, or leave both empty.
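A minimal `.env` based on the table above (all values are placeholders, shown here for PostgreSQL; the AWS block is needed only when using S3):

```env
DB_HOST=localhost
DB_PORT=5432
DB_NAME=mydb
DB_USER=dbuser
DB_PASSWORD=changeme

# Required only for S3 upload/download
AWS_ACCESS_KEY=your-access-key
AWS_SECRET_KEY=your-secret-key
AWS_REGION=us-east-1
```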
Run with Docker:

```bash
docker pull chinu1710/db-backup-cli:1.0.0
cp .env.example .env
mkdir -p data
```

MongoDB:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  backup --db-type mongodb --output /app/data/mongodb-backup --compress --cloud s3 --bucket your-s3-bucket-name --log-file /app/data/backup.log
```

MySQL:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  backup --db-type mysql --output /app/data/mysql-backup.sql --compress --cloud s3 --bucket your-s3-bucket-name --log-file /app/data/backup.log
```

PostgreSQL:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  backup --db-type postgresql --output /app/data/postgres-backup.dump --compress --cloud s3 --bucket your-s3-bucket-name --log-file /app/data/backup.log
```

SQLite:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  backup --db-type sqlite --output /app/data/sqlite-backup.sql --compress --cloud s3 --bucket your-s3-bucket-name --log-file /app/data/backup.log
```

Restore from local backup:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  restore --db-type mongodb --backup-file /app/data/mongodb-backup.tar.gz --log-file /app/data/restore.log
```

Restore latest from S3 bucket:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  restore --db-type mongodb --cloud s3 --bucket your-s3-bucket-name --download-dir /app/data --log-file /app/data/restore.log
```

Restore a specific S3 key:
```bash
sudo docker run --rm \
  --network host \
  --env-file .env \
  -v "$(pwd)/data:/app/data" \
  chinu1710/db-backup-cli:1.0.0 \
  restore --db-type mongodb --cloud s3 --bucket your-s3-bucket-name --s3-key your-backup-object-key.tar.gz --download-dir /app/data --log-file /app/data/restore.log
```

Run locally:

```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
cp .env.example .env
```

Show help:

```bash
python cli.py --help
python cli.py backup --help
python cli.py restore --help
```

Backup:
```bash
python cli.py backup \
  --db-type mongodb \
  --output ./mongodb-backup \
  --compress \
  --cloud s3 \
  --bucket your-s3-bucket-name \
  --log-file backup.log
```

Restore from specific S3 key:
```bash
python cli.py restore \
  --db-type mongodb \
  --cloud s3 \
  --bucket your-s3-bucket-name \
  --s3-key your-backup-object-key.tar.gz \
  --download-dir . \
  --log-file restore.log
```

Restore from local backup:
```bash
python cli.py restore \
  --db-type mongodb \
  --backup-file ./mongodb-backup.tar.gz \
  --log-file restore.log
```