Jay

Reputation: 3479

Automatically back up MySQL database on Linux server

I need a script that automatically makes a backup of a MySQL database. I know there are already many posts and scripts on this topic, but here is where mine differs.

  1. The script needs to run on the machine hosting the MySQL database (it is a Linux machine).
  2. The backups must be saved onto the same server that the database is on.
  3. A backup needs to be made every 30 minutes.
  4. When a backup is older than a week it is deleted, unless it is the very first backup created that week. E.g. out of the backups backup_1_12_2010_0-00_Mon.db, backup_1_12_2010_0-30_Mon.db, backup_1_12_2010_1-00_Mon.db ... backup_7_12_2010_23-30_Sun.db, only backup_1_12_2010_0-00_Mon.db is kept.

Anyone have anything similar or any ideas where to start?
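To illustrate requirement 4, here is a minimal sketch of the retention rule (the directory name is hypothetical; it assumes each week's first backup is the one stamped Monday 0-00, as in the filenames above):

```shell
#!/bin/sh
# Delete backups older than a week, except each week's first backup.
# With a dump every 30 minutes starting at midnight, the first backup
# of a week is the one stamped Monday 0-00, so it can be excluded by name.
prune_backups() {
    find "$1" -name 'backup_*.db' -type f -mtime +7 \
        ! -name '*_0-00_Mon.db' -delete
}

# e.g. prune_backups /var/backups/mysql
```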

Upvotes: 14

Views: 50206

Answers (7)

apogoreliy

Reputation: 91

Having read the question and the good answers here, I would add a few more points; some of them have been mentioned already.

The backup process typically involves the following steps:

  1. Create a backup
  2. Compress the backup file
  3. Encrypt the compressed backup
  4. Send the backup to a cloud (DropBox, OneDrive, GoogleDrive, AmazonS3,...)
  5. Get a notification about results
  6. Setup a schedule to run the backup process periodically
  7. Delete the old backup files
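The script below covers steps 1, 2, 5, and 7 but not encryption (step 3). A hedged sketch of that step using symmetric openssl encryption (gpg would work just as well; the file names and passphrase file are hypothetical):

```shell
# Encrypt a compressed backup with a passphrase kept in a root-only file.
encrypt_backup() {
    # $1 = file to encrypt, $2 = passphrase file
    openssl enc -aes-256-cbc -pbkdf2 -salt \
        -in "$1" -out "$1.enc" -pass "file:$2"
}

# e.g. encrypt_backup /var/backups/all-database.zip /root/.backup_passphrase
```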

Writing a script that covers all of these steps takes some effort and knowledge.

I would like to share a link to an article (I'm one of the writers) which describes the most common ways to back up MySQL databases in some detail:

  1. Bash script

     #!/bin/bash

     # Backup storage directory
     backup_folder=/var/backups

     # Notification email address
     recipient_email=<[email protected]>

     # MySQL user
     user=<user_name>

     # MySQL password
     password=<password>

     # Number of days to store the backup
     keep_day=30

     # Use a single timestamp so the .sql and .zip names always match
     timestamp=$(date +%d-%m-%Y_%H-%M-%S)
     sqlfile=$backup_folder/all-database-$timestamp.sql
     zipfile=$backup_folder/all-database-$timestamp.zip

     # Create a backup
     if sudo mysqldump -u $user -p$password --all-databases > $sqlfile; then
        echo 'Sql dump created'
     else
        echo 'mysqldump returned a non-zero code' | mailx -s 'No backup was created!' $recipient_email
        exit 1
     fi

     # Compress the backup
     if zip $zipfile $sqlfile; then
        echo 'The backup was successfully compressed'
     else
        echo 'Error compressing backup' | mailx -s 'Backup was not created!' $recipient_email
        exit 1
     fi

     rm $sqlfile

     echo $zipfile | mailx -s 'Backup was successfully created' $recipient_email

     # Delete old backups
     find $backup_folder -type f -mtime +$keep_day -delete
    
  2. Automysqlbackup

     # Either install the packaged version:
     sudo apt-get install automysqlbackup

     # ...or install the latest version from GitHub:
     wget https://github.com/sixhop/AutoMySQLBackup/archive/master.zip
     sudo mkdir /opt/automysqlbackup
     sudo mv master.zip /opt/automysqlbackup
     cd /opt/automysqlbackup
     unzip master.zip
     cd AutoMySQLBackup-master

     sudo ./install.sh
    
     sudo nano /etc/automysqlbackup/automysqlbackup.conf
    
     CONFIG_configfile="/etc/automysqlbackup/automysqlbackup.conf"
     CONFIG_backup_dir='/var/backup/db'
     CONFIG_mysql_dump_username='root'
     CONFIG_mysql_dump_password='my_password'
     CONFIG_mysql_dump_host='localhost'
     CONFIG_db_names=('my_db')
     CONFIG_db_exclude=('information_schema')
     CONFIG_mail_address='[email protected]'
     CONFIG_rotation_daily=6
     CONFIG_rotation_weekly=35
     CONFIG_rotation_monthly=150
    
     automysqlbackup /etc/automysqlbackup/automysqlbackup.conf
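     Step 6 (scheduling) can then be a crontab entry; the binary path below assumes install.sh placed it in /usr/local/bin:

```shell
0 2 * * * /usr/local/bin/automysqlbackup /etc/automysqlbackup/automysqlbackup.conf
```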
    
  3. Third party tools

Hope it would be helpful!

Upvotes: 4

Alfabravo

Reputation: 7569

Answer: a cron job.

Try creating a file something.sh with this:

 #!/bin/sh
 mysqldump -u root -ppwd --opt db1 > /respaldosql/db1.sql
 mysqldump -u root -ppwd --opt db2 > /respaldosql/db2.sql
 cd /respaldosql/
 tar -zcvf backupsql_$(date +%d%m%y).tgz *.sql
 find . -name '*.tgz' -type f -mtime +2 -exec rm -f {} \;

Give the file execute permission:

 chmod 700 something.sh

(use sudo if the file is not owned by you)

and then create a cron with

 crontab -e

setting it like

 0 1 * * * /home/youruser/coolscripts/something.sh

Remember that the numbers or '*' characters have this structure:

  • Minutes (range 0-59)
  • Hours (0-23)
  • Day of month (1-31)
  • Month (1-12)
  • Day of the week (0-6, where 0 = Sunday)
  • Absolute path to the script or program to run

You can also use the helper folders available in most Linux distros: /etc/cron.daily, /etc/cron.hourly, /etc/cron.weekly, etc. In this case, you can place your script (or a symlink to it) in the chosen folder and the OS will take care of running it with the promised recurrence (from a helpful comment by @Nick). Note that run-parts skips files whose names contain a dot, so a symlink into these folders must not be named something.sh.

Upvotes: 25

ggg

Reputation: 307

You might consider the open-source tool matiri, https://github.com/AAFC-MBB/matiri, which is a concurrent MySQL backup script with metadata stored in SQLite3. Features (more than you were asking for...):

  • Multi-server: multiple MySQL servers are supported, whether co-located on the same or on separate physical servers.
  • Parallel: each database on a server is backed up separately and in parallel (concurrency is settable; default: 3).
  • Compressed: each database backup is compressed.
  • Checksummed: a SHA256 is stored for each compressed backup file and for the archive of all files.
  • Archived: all database backups are tar'ed together into a single file.
  • Recorded: backup information is stored in an SQLite3 database.

Full disclosure: original matiri author.

Upvotes: 0

shaneonabike

Reputation: 301

My preference is for AutoMySQLBackup, which ships with Debian. It's really easy to use and creates daily backups, which can be configured. It also keeps weekly and monthly backups.

I have had this running for a while and it's super easy to configure and use!

Upvotes: 2

Ankit Singhania

Reputation: 1010

Create a shell script like the one below:

#!/bin/bash
mysqldump -u username -p'password' dbname > /my_dir/db_$(date +%m-%d-%Y_%H-%M-%S).sql
find /my_dir -mtime +10 -type f -delete

Replace username, password and your backup directory (my_dir). Save it in a directory (shell_dir) as filename.sh.

Schedule it to run everyday using crontab -e like:

30 8 * * * /shell_dir/filename.sh

This will run every day at 8:30 AM and back up the database. It also deletes backups older than 10 days. If you don't want that, just delete the last line from the script.
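Since the original question asks for a backup every 30 minutes, the schedule line would instead use a step value:

```shell
*/30 * * * * /shell_dir/filename.sh
```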

Upvotes: 10

hornetbzz

Reputation: 9357

This does pretty much the same as many of the other answers.

  1. The script needs to run on the machine hosting the MySQL database (it is a Linux machine).
    => Create a local bash or perl script (or whatever) "myscript" on this machine "A"

  2. The backups must be saved onto the same server that the database is on.
    => In the script "myscript", you can just use mysqldump. From the local backup, you may create a tarball that you send via scp to a remote machine. Finally, you can put your backup script into the crontab (crontab -e).

Here are some hints and functions to get you started; I won't post my entire script, and it doesn't fully do the trick, but it's not far off:

#!/bin/bash
...
MYSQLDUMP="$(which mysqldump)"
FILE="$LOCAL_TARBALLS/$TARBALL/mysqldump_$db-$SNAPSHOT_DATE.sql"
$MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS $db > $FILE && $GZIP $GZ_COMPRESSION_LEVEL $FILE

# Note: "function", "local" and "[[" require bash, hence the shebang above.
function create_tarball()
{
    local tarball_dir=$1
    tar -zpcvf $tarball_dir"_"$SNAPSHOT_DATE".tar.gz" $tarball_dir >/dev/null
    return $?
}

function send_tarball()
{
    local PROTOCOLE_="2"
    local IPV_="4"
    local PRESERVE_="p"
    local COMPRESSED_="C"
    local PORT="-P $DESTINATION_PORT"
    local EXECMODE="B"

    local SRC=$1
    local DESTINATION_DIR=$2
    local DESTINATION_HOST=$DESTINATION_USER"@"$DESTINATION_MACHINE":"$DESTINATION_DIR

    local COMMAND="scp -$PROTOCOLE_$IPV_$PRESERVE_$COMPRESSED_$EXECMODE $PORT $SRC $DESTINATION_HOST &"

    echo "remote copy command: $COMMAND"
    [[ $REMOTE_COPY_ACTIVATED = "Yes" ]] && eval $COMMAND
}

Then to delete files older than "date", you can look at man find and focus on the mtime and newer options.
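For example, a minimal sketch of the age-based cleanup (hypothetical directory and pattern; -newermt is a GNU find extension if you prefer comparing against a fixed date):

```shell
# Delete tarballs older than 7 days.
# An alternative with a fixed cutoff date would be:
#   find "$dir" -name '*.tar.gz' -type f ! -newermt '2010-12-01' -delete
prune_old() {
    find "$1" -name '*.tar.gz' -type f -mtime +7 -delete
}

# e.g. prune_old /var/backups
```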

Edit: as said earlier, there is no particular interest in keeping a local backup, except as a temporary file so you can send a tarball easily and delete it once sent.

Upvotes: 9

pgl

Reputation: 7981

You can do most of this with a one-line cronjob set to run every 30 minutes:

mysqldump -u<user> -p<pass> <database> > /path/to/dumps/db.$(date +%a.%H:%M).dump

This will create a database dump every 30 minutes, and every week it'll start overwriting the previous week's dumps.

Then have another cronjob that runs once a week that copies the most recent dump to a separate location where you're keeping snapshots.
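That weekly cronjob could be as simple as (hypothetical paths; it runs every Monday shortly after midnight and copies the newest dump into a snapshots directory):

```shell
35 0 * * 1 cp "$(ls -t /path/to/dumps/*.dump | head -1)" /path/to/snapshots/
```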

Upvotes: 8
