Automated backup system that dumps all PostgreSQL databases and uploads them to Google Drive. Designed to run as a Linux cron job with absolute paths.
- Automatically discovers all databases on PostgreSQL server
- Creates timestamped .sql dump files for each database
- Uploads dumps to Google Drive
- Full logging with rotation support
- Cleanup of local files after successful upload
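A minimal sketch of the dump-and-name step described above, assuming hypothetical helper names (the real logic lives in `modules/pg-server.py`):

```python
import datetime
import os
import subprocess

def dump_filename(db_name: str, now: datetime.datetime) -> str:
    """Timestamped dump name, e.g. myapp_db_20260115_020000.sql."""
    return f"{db_name}_{now.strftime('%Y%m%d_%H%M%S')}.sql"

def pg_dump_command(db_name: str, host: str, port: int, user: str, out_path: str) -> list:
    """Assemble the pg_dump invocation; the password is passed via PGPASSWORD."""
    return ["pg_dump", "-h", host, "-p", str(port), "-U", user,
            "-f", out_path, db_name]

def backup_database(db_name, host, port, user, password, out_dir):
    """Dump one database to a timestamped .sql file and return its path."""
    out_path = os.path.join(out_dir, dump_filename(db_name, datetime.datetime.now()))
    subprocess.run(pg_dump_command(db_name, host, port, user, out_path),
                   env={**os.environ, "PGPASSWORD": password}, check=True)
    return out_path
```

The discovery step would typically run `SELECT datname FROM pg_database WHERE NOT datistemplate;` and call a function like `backup_database` once per result.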
Install the Python dependencies:

    /usr/bin/python3 -m pip install -r /Users/owner/Documents/nskh/requirements.txt

Ensure PostgreSQL is running and you have:
- Database username and password
- Permission to access all databases you want to back up
- Go to Google Cloud Console
- Create a new project or select existing one
- Enable Google Drive API
- Create a Service Account:
  - Go to IAM & Admin > Service Accounts
  - Create Service Account
  - Grant it appropriate permissions
  - Create and download JSON key file
- Save the credentials file as `credentials.json`
- (Optional) Share a Google Drive folder with the service account email
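Once the key file is in place, the upload side (in `modules/drive-service.py`) presumably does something along these lines; the function names here are hypothetical, and the Google client libraries are imported lazily so they are only required at upload time:

```python
import os
from typing import Optional

def build_drive_metadata(filename: str, folder_id: Optional[str]) -> dict:
    """Drive file metadata; 'parents' pins the upload to a shared folder."""
    metadata = {"name": filename}
    if folder_id:
        metadata["parents"] = [folder_id]
    return metadata

def upload_dump(local_path: str, credentials_file: str, folder_id: Optional[str]) -> dict:
    """Upload one dump with a service account; returns the created file's id."""
    # Requires google-api-python-client and google-auth (see requirements.txt).
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    creds = service_account.Credentials.from_service_account_file(
        credentials_file, scopes=["https://www.googleapis.com/auth/drive.file"])
    service = build("drive", "v3", credentials=creds)
    media = MediaFileUpload(local_path)
    metadata = build_drive_metadata(os.path.basename(local_path), folder_id)
    return service.files().create(body=metadata, media_body=media,
                                  fields="id").execute()
```

With the `drive.file` scope the service account only sees files it created itself, which is why the target folder must be shared with its email for uploads into that folder to work.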
Copy the example environment file and edit it:
    cp /Users/owner/Documents/nskh/.env.example /Users/owner/Documents/nskh/.env

Edit `.env` and update these values:
    PG_HOST=localhost
    PG_PORT=5432
    PG_USER=postgres
    PG_PASSWORD=your_password
    CREDENTIALS_FILE=/absolute/path/to/credentials.json
    DRIVE_FOLDER_ID=your_folder_id_or_leave_empty

Important: All paths must be absolute for cron job compatibility.
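Because cron runs with a minimal environment, the app must read `.env` itself, and it is worth sanity-checking that path values really are absolute. A rough sketch (the parser and function names are illustrative, not the app's actual code):

```python
import os

def load_env(path: str) -> dict:
    """Minimal .env parser: KEY=VALUE lines; blanks and '#' comments ignored."""
    config = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            config[key.strip()] = value.strip()
    return config

def check_absolute(config: dict, keys=("CREDENTIALS_FILE",)) -> None:
    """Fail fast if a path that cron will need is not absolute."""
    for key in keys:
        value = config.get(key, "")
        if value and not os.path.isabs(value):
            raise ValueError(f"{key} must be an absolute path, got: {value}")
```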
Run a backup manually to verify the setup:

    /usr/bin/python3 /Users/owner/Documents/nskh/app.py

Edit crontab:
    crontab -e

Add one of these entries:
Daily backup at 2:00 AM:

    0 2 * * * /usr/bin/python3 /Users/owner/Documents/nskh/app.py >> /var/log/pg_backup/cron.log 2>&1

Every 6 hours:

    0 */6 * * * /usr/bin/python3 /Users/owner/Documents/nskh/app.py >> /var/log/pg_backup/cron.log 2>&1

Weekly backup (Sunday at 3:00 AM):

    0 3 * * 0 /usr/bin/python3 /Users/owner/Documents/nskh/app.py >> /var/log/pg_backup/cron.log 2>&1

Backup files are named: `{database_name}_{timestamp}.sql`
Example: myapp_db_20260115_020000.sql
Logs are stored in /var/log/pg_backup/ with daily rotation.
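The daily rotation mentioned above maps naturally onto the standard library's `TimedRotatingFileHandler`. A sketch, with the `backup.log` filename and 14-day retention as assumptions:

```python
import logging
from logging.handlers import TimedRotatingFileHandler

def make_logger(log_dir: str = "/var/log/pg_backup", days_kept: int = 14) -> logging.Logger:
    """Logger that rotates at midnight and keeps `days_kept` old files."""
    logger = logging.getLogger("pg_backup")
    logger.setLevel(logging.INFO)
    handler = TimedRotatingFileHandler(
        f"{log_dir}/backup.log", when="midnight", backupCount=days_kept)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    return logger
```

Rotated files get a date suffix (e.g. `backup.log.2026-01-15`), and anything beyond `backupCount` is deleted automatically.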
    nskh/
    ├── app.py               # Main application
    ├── modules/
    │   ├── pg-server.py     # PostgreSQL backup logic
    │   └── drive-service.py # Google Drive upload logic
    ├── requirements.txt     # Python dependencies
    ├── .env                 # Environment configuration (not in git)
    ├── .env.example         # Environment configuration template
    ├── credentials.json     # Google service account credentials (not in git)
    └── README.md            # This file
- Verify PostgreSQL is running:

      systemctl status postgresql

- Check `pg_hba.conf` for authentication settings
- Ensure user has proper permissions
- Verify credentials.json is valid
- Check service account has necessary permissions
- If using folder ID, ensure folder is shared with service account email
- Check cron service:

      systemctl status cron

- Verify absolute paths in crontab
- Check cron logs:

      grep CRON /var/log/syslog

- Ensure script has execute permissions:

      chmod +x /Users/owner/Documents/nskh/app.py
Create the log directory with proper permissions:

    sudo mkdir -p /var/log/pg_backup
    sudo chown $USER:$USER /var/log/pg_backup

- Keep `credentials.json` secure and never commit it to git
- Consider encrypting sensitive configuration data
- Regularly rotate service account keys
- Monitor Google Drive storage usage
- Consider implementing backup retention policies
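A retention policy for the local staging directory can be as simple as deleting dumps past a cutoff age; a sketch (the function name and 30-day default are illustrative):

```python
import time
from pathlib import Path

def prune_old_dumps(backup_dir: str, keep_days: int = 30) -> list:
    """Delete .sql dumps older than keep_days; return the paths removed."""
    cutoff = time.time() - keep_days * 86400
    removed = []
    for dump in Path(backup_dir).glob("*.sql"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()
            removed.append(str(dump))
    return removed
```

The same idea applies on the Drive side, where old uploads would be listed and trashed via the API rather than the filesystem.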