Simple postgres backups with cron and S3

We create a bash script, which will be run by cron, at ~/


#!/bin/bash
# Arguments passed in from the cron entry: project name, database name, S3 bucket
project=$1
dbname=$2
bucket=$3

# Load the AWS credentials exported in ~/aws.env
source ~/aws.env

echo "Starting backups"
bak1=$project.$(date +%F).pg.backup
bak2=$project.$(date +%F).pg.sql.backup

mkdir -p /home/ox/backups/

printf 'Creating Backup\n - %s\n - %s\n' "$bak1" "$bak2"
pg_dump --format=custom --file /home/ox/backups/$bak1 $dbname
pg_dump --file /home/ox/backups/$bak2 $dbname

echo "Uploading Backups to $bucket"
aws s3api put-object --body /home/ox/backups/$bak1 --key $project/$bak1 --bucket $bucket --server-side-encryption AES256
aws s3api put-object --body /home/ox/backups/$bak2 --key $project/$bak2 --bucket $bucket --server-side-encryption AES256

echo "Done"
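The script keeps two dumps deliberately: the custom-format file can be restored (even selectively) with pg_restore, while the plain SQL file is human-readable and can be replayed with psql. A minimal sketch of the filename scheme and the matching restore commands (the project name myproj is hypothetical):

```shell
# Filenames embed the project name and the date, so each day gets a fresh pair
project=myproj                           # hypothetical project name
bak1=$project.$(date +%F).pg.backup      # custom format, restore with pg_restore
bak2=$project.$(date +%F).pg.sql.backup  # plain SQL, replay with psql
echo "$bak1"

# Restoring later would look like (sketch, not run here):
#   pg_restore --dbname=$dbname /home/ox/backups/$bak1
#   psql $dbname < /home/ox/backups/$bak2
```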

Then we create a ~/aws.env file with our S3 bucket credentials:

export AWS_ACCESS_KEY_ID=1234
export AWS_SECRET_ACCESS_KEY=1234
export AWS_BUCKET=bucketname

Then we set a cron entry using crontab -e:

0 20 * * * /bin/bash /home/ox/ project dbname bucketname && curl -fsS -m 10 --retry 5 -o /dev/null $webhookId

This cron entry reports success or failure to a monitoring webhook, which in turn posts an error on Slack if the backup fails to report a success.
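The alerting hinges on the `&&` between the two commands: curl only runs when the backup script exits with status 0, so a failed backup sends no ping and the monitoring service raises the alarm. A tiny sketch of that behaviour, with stand-in functions instead of the real backup and curl:

```shell
# `cmd1 && cmd2` runs cmd2 only when cmd1 exits 0 (success)
backup_ok()   { return 0; }   # stand-in for a successful backup run
backup_fail() { return 1; }   # stand-in for a failed backup run

backup_ok   && echo "ping sent"   # prints: ping sent
backup_fail && echo "ping sent" || echo "no ping, monitor raises an alert"
```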

That is as simple as it gets. I’ve been using this for my hobby projects for quite some time now.

For more serious projects, I add a few more bells and whistles: automating the setup with Ansible, database versioning, pruning old backups, and so on. If you want to know more or have any questions, tweet me at @oxalorg.
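One of those extras, pruning old backups, is small enough to sketch here: delete local dumps older than a retention window with find. The 30-day window and the helper name are my assumptions, not part of the original setup; for the copies in S3, a bucket lifecycle rule is the usual fit.

```shell
# prune_backups DIR DAYS: delete *.backup files in DIR older than DAYS days
prune_backups() {
  find "$1" -type f -name '*.backup' -mtime +"$2" -delete
}

# e.g. as an extra line at the end of the backup script:
# prune_backups /home/ox/backups 30
```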