How to create your own Automated Backup Scripts in Linux with S3

As a software development company, we are often tasked with creating backup scripts to ensure data is recoverable in case of catastrophic failure. I'm sharing below the basic script we use for clients that require automated daily backups on Linux with Amazon S3. The script backs up files and a database to local storage and then transfers the backup files to Amazon S3.

Step 1: Install s3cmd to access the S3 bucket from your server (optional, needed only for the S3 transfer).

sudo apt-get install s3cmd
s3cmd --configure

The --configure option will prompt for your S3 bucket and credentials; just follow the prompts. You can get this information from your AWS S3 Console. At the end, s3cmd will offer to test and verify the connection.
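
If you skipped that built-in test, you can verify the configuration yourself with a quick round trip (the bucket name below is the same placeholder used later in the script; substitute your own):

#list the buckets visible to the configured credentials
s3cmd ls

#upload and then delete a small test object
echo 'connection test' > /tmp/s3test.txt
s3cmd put /tmp/s3test.txt s3://[s3_bucket]/s3test.txt
s3cmd del s3://[s3_bucket]/s3test.txt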

Step 2: Create the backup script.

Copy the script below, which backs up your database and file system into your backup folder. The script includes commands to transfer the backup files to S3; you may remove them if you do not need the S3 transfer.

#!/bin/sh

#backup folder within the server where backup files are stored
BAK_DEST=~/backup

#database credentials
DB_USERNAME=[username]
DB_PASSWORD=[password]
DB_SCHEMA=[schema]

#folders for backup, can be comma separated for multiple folders
BAK_SOURCES=[folder_A, folder_B]

#s3 bucket name that contains backup
S3_BUCKET=[s3_bucket]

#number of days to keep archives
KEEP_DAYS=7

#script variables
BAK_DATE=`date +%F`
BAK_DATETIME=`date +%F-%H%M`
BAK_FOLDER=${BAK_DEST}/${BAK_DATE}
BAK_DB=${BAK_FOLDER}/db-${BAK_DATETIME}

#CREATE the folder where the backup files are to be placed
echo 'Creating backup folder ' ${BAK_FOLDER}
mkdir -p ${BAK_FOLDER}

#PERFORM MySQL database dump
echo 'Creating archive file ' ${BAK_DB}'.tar.gz. Please wait...'
mysqldump -u ${DB_USERNAME} -p${DB_PASSWORD} ${DB_SCHEMA} > ${BAK_DB}.sql
tar czPf ${BAK_DB}.tar.gz ${BAK_DB}.sql

echo 'Copying database backup to S3 ...'
s3cmd put ${BAK_DB}.tar.gz s3://${S3_BUCKET}/backup/db/db-${BAK_DATETIME}.tar.gz

#ARCHIVING FILES / FOLDER
echo 'Archiving files and folders...'

FOLDERS=$(echo $BAK_SOURCES | tr "," "\n")
i=0
for F in $FOLDERS
do
  echo 'Archiving ' ${F} '...'
  i=`expr ${i} + 1`
  tar czPf ${BAK_FOLDER}/FILE_${i}_${BAK_DATETIME}.tar.gz ${F}
  s3cmd put ${BAK_FOLDER}/FILE_${i}_${BAK_DATETIME}.tar.gz s3://${S3_BUCKET}/backup/files/FILE_${i}_${BAK_DATETIME}.tar.gz
done

#DELETE LOCAL BACKUP FILES OLDER THAN KEEP_DAYS DAYS
#(search the whole backup destination, not just today's folder)
echo 'Deleting backups older than '${KEEP_DAYS}' days'
find ${BAK_DEST} -type f -mtime +${KEEP_DAYS} \( -name '*.gz' -o -name '*.sql' \) -execdir rm -- {} \;

The first part of the script contains user variables that should be replaced with your own values (database username and password, folders to back up, S3 bucket name, number of days to keep archives).
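
For illustration only, here is what the variable block might look like once filled in (all values below are made-up examples):

#example values only, replace with your own
DB_USERNAME=backup_user
DB_PASSWORD=changeme
DB_SCHEMA=myapp_db
BAK_SOURCES=/var/www/myapp,/etc/nginx
S3_BUCKET=my-backup-bucket
KEEP_DAYS=7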

For your reference, the operation of the script is as follows:

  1. Back up the database using mysqldump.
  2. Compress the database backup.
  3. Send the database backup to S3.
  4. Loop through all the source folders.
  5. Compress each folder.
  6. Send each archive to S3.
  7. Delete all local backup files older than 7 days.

Step 3: Test and set up the cron job

To test the script, simply execute it. The script will display progress messages as it runs, and you can verify the result by checking the backup files created.

   $ ./backup.sh
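
If the script is not yet executable, run chmod +x backup.sh first. Once the run finishes, a quick way to verify the result (assuming the default BAK_DEST of ~/backup) is:

   $ ls -lh ~/backup/$(date +%F)/
   $ s3cmd ls s3://[s3_bucket]/backup/db/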

To set up the cron job, execute the command below:

   $ crontab -e

Then, configure the schedule:

@daily ~/backup.sh
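
@daily runs at midnight. If you prefer an explicit time and a log of each run, an entry along these lines also works (the script path and log file below are examples; use your actual locations):

0 2 * * * /home/[user]/backup.sh >> /home/[user]/backup.log 2>&1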

That’s it! You should have your daily backups automatically running.

P.S. Please make sure to test the backup files created.
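
One simple check, for example, is to confirm that the archive lists cleanly and that the database dump restores into an empty scratch schema (the file names and scratch schema below are placeholders):

#confirm the archive is readable and contains the dump
tar tzf ~/backup/[date]/db-[datetime].tar.gz

#restore the dump into an empty scratch schema to confirm it is usable
mysql -u [username] -p[password] [scratch_schema] < ~/backup/[date]/db-[datetime].sql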

Automating your backup solution is cost effective and saves time. If you’re a client looking for a software development company for your automation project, please don’t hesitate to contact us today.
