Reputation: 167
We have an ERP system backed by a SQL Server database, which generates DB backup files on a Windows PC every day.
Our manager wants the day's DB backup file to be uploaded to an AWS S3 bucket automatically every day. We have already registered an AWS account and created a dedicated bucket for this. To keep S3 storage costs down, only the ten most recent DB backup files should be kept in the bucket.
It seems we need the AWS CLI and some scripting to meet this requirement. Should it be implemented as a .bat file or as a Windows PowerShell script? How can we check whether there are already ten DB backup files in the S3 bucket, and if so, how do we delete the oldest one before uploading the latest?
Is there any similar sample code? Thanks!
Upvotes: 0
Views: 915
Reputation: 19
You can create a cron job for this. You will also need to install the AWS CLI. If you are hosting your DB on Linux, then this script might help you:
#!/bin/bash
# Date stamps used in the backup file and folder names
NOW=$(date +"%Y-%m-%d")
NOW_TIME=$(date +"%Y-%m-%d %T %p")
NOW_MONTH=$(date +"%Y-%m")

# Database connection settings
MYSQL_HOST="localhost"
MYSQL_PORT="3306"
MYSQL_DATABASE="wordpress"
MYSQL_USER="user"
MYSQL_PASSWORD="password"

# Local backup path and S3 destination
BACKUP_DIR="/home/mkyong/backup/$NOW_MONTH"
BACKUP_FULL_PATH="$BACKUP_DIR/$MYSQL_DATABASE-$NOW.sql.gz"
AMAZON_S3_BUCKET="s3://mkyong/backup/linode/mysql/$NOW_MONTH/"
AMAZON_S3_BIN="/home/mkyong/.local/bin/aws"
#################################################################
mkdir -p "${BACKUP_DIR}"

# Dump the database and compress it into the local backup directory
backup_mysql(){
    mysqldump -h ${MYSQL_HOST} \
        -P ${MYSQL_PORT} \
        -u ${MYSQL_USER} \
        -p${MYSQL_PASSWORD} ${MYSQL_DATABASE} | gzip > ${BACKUP_FULL_PATH}
}

# Upload the compressed dump to the S3 bucket
upload_s3(){
    ${AMAZON_S3_BIN} s3 cp ${BACKUP_FULL_PATH} ${AMAZON_S3_BUCKET}
}

backup_mysql
upload_s3
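To cover the retention requirement (keep only the ten most recent backups in the bucket), you could add a pruning step and call it right after upload_s3. The sketch below is one possible approach using the AWS CLI; the prune_s3 function and the S3_BUCKET, S3_PREFIX and KEEP_COUNT variables are illustrative names, not part of the script above, and it assumes all backups live under a single prefix in the bucket.

# Hypothetical retention settings (adjust bucket name and prefix to yours)
S3_BUCKET="mkyong"
S3_PREFIX="backup/linode/mysql/"
KEEP_COUNT=10

# Delete everything under the prefix except the newest ${KEEP_COUNT} objects
prune_s3(){
    ${AMAZON_S3_BIN} s3api list-objects-v2 \
        --bucket "${S3_BUCKET}" --prefix "${S3_PREFIX}" \
        --query "sort_by(Contents, &LastModified)[].Key" --output text \
        | tr '\t' '\n' \
        | head -n -${KEEP_COUNT} \
        | while read -r key; do
            ${AMAZON_S3_BIN} s3 rm "s3://${S3_BUCKET}/${key}"
          done
}

Scheduling the whole script with a daily cron entry covers the "automatically and regularly every day" part. On Windows, the same idea works with the AWS CLI called from a PowerShell or .bat script run by Task Scheduler: list the objects, sort them by date, delete all but the newest ten, then upload the new .bak file.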
Upvotes: 1