Reputation: 199
I have a bucket in AWS S3. There are two folders in the bucket - folder1 & folder2. I want to copy the files from s3://myBucket/folder1 to s3://myBucket/folder2. But there is a twist: I ONLY want to copy the items in folder1 that were created after a certain date. I want to do something like this:
aws s3 cp s3://myBucket/folder1 s3://myBucket/folder2 --recursive --copy-source-if-modified-since 2020-07-31
Upvotes: 1
Views: 1823
Reputation: 132862
There is no aws-cli command that will do this for you in a single line. If the number of files is relatively small, say a hundred thousand or fewer, I think it would be easiest to write a bash script (or use your favourite language's AWS SDK) that lists the first folder, filters on creation date, and issues the copy commands.
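A rough sketch of that approach in bash, assuming a configured AWS CLI and placeholder bucket and cutoff values, could look like this. It filters on LastModified client-side rather than using a conditional copy:

cutoff="2020-07-31T00:00:00"
aws s3api list-objects-v2 --bucket my-bucket --prefix folder1/ \
    --query 'Contents[].[Key,LastModified]' --output text |
while IFS=$'\t' read -r key modified; do
    # ISO 8601 timestamps compare correctly as plain strings
    if [[ "$modified" > "$cutoff" ]]; then
        aws s3 cp "s3://my-bucket/$key" "s3://my-bucket/${key/folder1/folder2}"
    fi
done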
If the number of files is large you can create an S3 Inventory report that gives you a listing of all the files in the bucket, which you can download and generate the copy commands from. This will be cheaper and quicker than listing the bucket yourself when there are lots and lots of objects.
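The report is delivered as gzipped CSV (or ORC/Parquet) files, and the CSV columns depend on which optional fields you enable when you configure the inventory. A rough sketch for turning a CSV report into copies, assuming columns of bucket, key and last-modified date and placeholder names, could be:

cutoff="2020-07-31T00:00:00.000Z"
awk -F'","' -v cutoff="$cutoff" '{
    gsub(/^"|"$/, "")    # strip the outer quotes
    if ($2 ~ /^folder1\// && $3 > cutoff) print $2
}' inventory.csv |
while read -r key; do
    # note: keys in the inventory report are URL-encoded, so decode them
    # first if they contain special characters
    aws s3 cp "s3://my-bucket/$key" "s3://my-bucket/${key/folder1/folder2}"
done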
Something like this could be a start, using @jarmod's suggestion about --copy-source-if-modified-since:
for key in $(aws s3api list-objects --bucket my-bucket --prefix folder1/ --query 'Contents[].Key' --output text); do
  # rewrite folder1/... to folder2/...
  relative_key=${key/folder1/folder2}
  # the copy only happens if the source object was modified after the cutoff
  aws s3api copy-object --bucket my-bucket --key "$relative_key" --copy-source "my-bucket/$key" --copy-source-if-modified-since THE_CUTOFF_DATE
done
It copies each object individually, and it will be fairly slow if there are lots of objects, but it's at least somewhere to start. Objects that were last modified before the cutoff fail the precondition (the CLI reports a PreconditionFailed error) and simply aren't copied.
Upvotes: 2