JeffreyLazo

Reputation: 853

Bash File to move files older than X to S3

I am trying to create a bash script, run via a cron job, that copies files and directories older than a certain date to Amazon S3 via the AWS CLI and then deletes the local copies.

I have the AWS S3 CLI installed and working; I just don't know how to write the script to copy and delete the files.

Upvotes: 2

Views: 4980

Answers (3)

Avinash bhardwaj

Reputation: 1

As discussed on the call, please find the scenario-based question; you have to write a script to do the following and share the script:

Scenario - Upload all the files and directories in a drive older than a day to AWS and delete them from the drive

Upvotes: -1

Naveen

Reputation: 687

This should do it, assuming you have your AWS credentials set up.

#!/bin/sh

##############################################################################################################
#
# this script will move any log files that are older than DAYS_TO_HOLD days to s3
#
##############################################################################################################

LOG_DIR=/some/log/dir
DAYS_TO_HOLD=7

echo "Starting log cleanup process ..."

# -mtime +7 matches files last modified more than 7 days ago.
# Quoting ${LOG_DIR} and using {} lets find hand each matching path to aws.
find "${LOG_DIR}" -name "*your_file_pattern_match_here*" -mtime +${DAYS_TO_HOLD} -exec aws s3 mv {} s3://somebucket-that-holds-logs \; >/dev/null 2>&1

echo "Log clean up completed"
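Since the question mentions running this from cron, a minimal crontab sketch (the script path and log path are placeholders, not anything from the answer above):

```shell
# Edit with `crontab -e`. Runs the cleanup script every day at 02:30
# and appends its output to a log file for troubleshooting.
30 2 * * * /path/to/log-cleanup.sh >> /var/log/log-cleanup.log 2>&1
```

Note that cron jobs run with a minimal environment, so the script may need an explicit PATH (or a full path to `aws`) and access to the same AWS credentials as your interactive shell.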

Upvotes: 1

Vikas Tiwari

Reputation: 537

Presuming "older" is judged by modification time:

find <dir_where_files_reside> -type f -mtime +X | while read -r line
do
    # Only delete the local file if the upload succeeded.
    aws s3 cp "$line" s3://<bucket_name>/
    if [[ $? -eq 0 ]]; then
        rm "$line"
    fi
done
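A sketch of the same copy-then-delete loop that also survives filenames containing spaces, using NUL-delimited paths (assumes GNU find, GNU touch, and bash; `echo` stands in for the real `aws s3 cp`/`rm` pair so the selection logic can be exercised locally):

```shell
#!/bin/bash
# Demo: select files older than 7 days with NUL-delimited names so that
# paths containing spaces or newlines are handled correctly.
tmp=$(mktemp -d)
touch -d '10 days ago' "$tmp/old log.txt"   # backdated file (GNU touch)
touch "$tmp/new.txt"                        # fresh file, should be skipped

find "$tmp" -type f -mtime +7 -print0 |
while IFS= read -r -d '' f; do
    # In the real script, replace echo with:
    #   aws s3 cp "$f" s3://<bucket_name>/ && rm "$f"
    echo "would move: $f"
done

rm -rf "$tmp"
```

The `-print0` / `read -d ''` pairing is the safe version of the plain `| while read line` pipeline, which silently breaks on unusual filenames.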

Upvotes: 2
