Sunday, 28 May 2017

Backup Mysql Database to AWS S3 Bucket

It is very important to make regular backups of your data to protect it from loss. In this tutorial we will use a bash script that automates your MySQL database backups on a daily, weekly, or custom schedule. The Linux s3cmd utility makes it easy to push those backups to an S3 bucket.

Create S3 Bucket :

Log in to your AWS account, go to "Services > S3", and click "Create Bucket". You will be prompted for a bucket name; keep a consistent naming format across your buckets, as this makes managing multiple buckets much easier.

Create IAM User :

Once the bucket is created, we need a user with full rights to access the bucket. To create the user, go to "Services > IAM > Users".

After clicking "Users", provide a name for the new user; here we created a user named :


Create a new policy for the S3 bucket and assign it to the new user so that the user can read from and write to the bucket.

Click on the "Create Your Own Policy" option.

Add the S3 bucket policy below in the "Policy Document" section; the listed actions allow the user to list buckets, list the backup bucket's contents, and read/write/delete its objects.

"Version": "2012-10-17",
"Statement": [
  "Effect": "Allow",
  "Action": ["s3:ListAllMyBuckets"],
  "Resource": "arn:aws:s3:::*"
  "Effect": "Allow",
  "Action": [
  "Resource": "arn:aws:s3:::dptsourcebackup"

  "Effect": "Allow",
  "Action": [
  "Resource": "arn:aws:s3:::dptsourcebackup/*"


Attach the policy to the new user.
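The console rejects malformed policy documents, so it is worth a quick local syntax check before pasting. A minimal sketch, assuming `python3` is available and using a temp-file path of our choosing:

```shell
# Save the policy document locally and confirm it parses as JSON
cat > /tmp/policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["s3:ListAllMyBuckets"], "Resource": "arn:aws:s3:::*" }
  ]
}
EOF
python3 -c 'import json; json.load(open("/tmp/policy.json")); print("valid JSON")'
# → valid JSON
```

Any missing brace or stray comma will surface here as a parse error instead of an unhelpful console message.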

Download the IAM user's Access Key and Secret Key and keep them safe; they are needed to configure s3cmd later.

Once the S3 bucket is configured, follow the steps below to set up the MySQL database backup script on the server.

Step 1 : Install s3cmd Utility 

s3cmd is a command-line utility for creating S3 buckets and for uploading, retrieving, and managing data in Amazon S3 storage.

For CentOS/RHEL 6 Server 

$ wget

$ yum install s3cmd

For CentOS/RHEL 7 Server 

$ wget
$ tar xzf s3cmd-1.6.1.tar.gz

$ cd s3cmd-1.6.1
$ sudo python setup.py install

For Ubuntu/Debian 

$ sudo apt-get install s3cmd

Step 2 : Configure s3cmd Environment 

In order to configure s3cmd you need the Access Key and Secret Key of your AWS account. We downloaded these keys earlier when creating the IAM user.

$ sudo s3cmd --configure

Output :

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key: xxxxxxxxxxxxxxxxxxxxxx
Secret Key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: xxxxxxxxxx
Path to GPG program [/usr/bin/gpg]:

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: Yes

Test access with supplied credentials? [Y/n] Y
Please wait, attempting to list all buckets...
Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
Configuration saved to '/root/.s3cfg'

Step 3 : Configure Backup Script 

Copy the backup script below into your "/opt" directory and give the file a name. Replace the placeholder MySQL credentials and bucket name at the top of the script with your own values.


#!/bin/bash

# Be pretty
echo -e " "
echo -e " Amazon Web Service S3 Mysql Backup Script "
echo -e " "

# Basic variables (placeholders -- replace with your own values)
mysqluser="root"
mysqlpass="yourMysqlPassword"
bucket="s3://dptsourcebackup"
tmpdir="/tmp"

# Timestamp (sortable AND readable)
stamp=`date +"%s - %A %d %B %Y @ %H%M"`

# List all the databases and eliminate the defaults
databases=`mysql -u $mysqluser -p$mysqlpass -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)"`

# Feedback
echo -e "Dumping to \e[1;32m$bucket/$stamp/\e[00m"

# Loop the databases
for db in $databases; do

  # Define our filenames
  tmpfile="$tmpdir/$db.sql.gz"
  object="$bucket/$stamp/$db.sql.gz"

  # Feedback
  echo -e "\e[1;34m$db\e[00m"

  # Dump and zip
  echo -e "  creating \e[0;35m$tmpfile\e[00m"
  mysqldump --single-transaction -u$mysqluser -p$mysqlpass --databases "$db" | gzip -c > "$tmpfile"

  # Upload
  echo -e "  uploading..."
  s3cmd put "$tmpfile" "$object"

  # Delete
  rm -f "$tmpfile"

done

# Job's a goodun
echo -e "\e[1;32mJob completed\e[00m"

Change Permissions and Test the Backup Script 

$ sudo chmod +x

$ sudo bash -x


Step 4 : Configure a Cronjob 

We will schedule our backup script to execute at a particular time.

The cron entry below runs daily at 2 AM IST:

00 02 * * * root /bin/bash /opt/scripts/  > /tmp/mysqlBackups.log 2>&1
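The trailing `> /tmp/mysqlBackups.log 2>&1` is what captures errors as well as normal output; without `2>&1`, failures printed on stderr would never reach the log. A quick sketch of the redirection:

```shell
# stdout and stderr both land in the same log file thanks to 2>&1
log=$(mktemp)
{ echo "backup ok"; echo "warning: dump was slow" >&2; } > "$log" 2>&1
cat "$log"
# → backup ok
# → warning: dump was slow
```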

Creating New Bucket :

s3cmd mb s3://dptsource

Uploading file in Bucket :

s3cmd put tech.txt s3://dptsource/

Uploading Directory in Bucket :

s3cmd put -r backup s3://dptsource/

List Data of S3 Bucket :

s3cmd ls s3://dptsource/

List Data of a Directory in S3 Bucket :

s3cmd ls -r s3://dptsource/

Download Files from Bucket :

s3cmd get s3://dptsource/tech.txt
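Since each backup object is a gzipped SQL dump, restoring after a download is a matter of piping it through `gunzip` into `mysql`, e.g. `gunzip -c appdb.sql.gz | mysql -u root -p appdb` (the filename and database name here are illustrative). The compress-and-read-back roundtrip the script relies on can be sketched with a stand-in dump:

```shell
# Compress a (stand-in) dump the same way the backup script does, then read it back
tmpfile=$(mktemp)
printf 'CREATE TABLE demo (id INT);\n' | gzip -c > "$tmpfile"
gunzip -c "$tmpfile"     # this is the stream you would pipe into mysql
rm -f "$tmpfile"
# → CREATE TABLE demo (id INT);
```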

Remove Data from S3 Bucket :

s3cmd del s3://dptsource/tech.txt

Remove S3 Bucket :

s3cmd rb s3://dptsource/

The bucket policy and backup script can be downloaded from :

Video Tutorial 

