Reputation: 11
I have an S3 bucket with paths such as:
s3://mybucket/data/2017-01-01/raw/file_file1.txt
s3://mybucket/data/2017-01-01/raw/file_file2.txt
s3://mybucket/data/2017-01-01/filtered/file_file3.txt
s3://mybucket/data/2017-02-01/edited/file_file4.txt
Is there a way to move all of the files at the end of these directories (they all start with file_) into:
s3://mybucket/data/
with a single command?
Upvotes: 1
Views: 3061
Reputation: 158
Recursively copying S3 objects to another bucket
When passed with the --recursive parameter, the following cp command recursively copies all objects under a specified bucket to another bucket while excluding some objects by using an --exclude parameter. In this example, the bucket mybucket has the objects test1.txt and another/test1.txt:
aws s3 cp s3://mybucket/ s3://mybucket2/ --recursive --exclude "mybucket/another/*"
Output:
copy: s3://mybucket/test1.txt to s3://mybucket2/test1.txt
You can combine the --exclude and --include options to copy only objects that match a pattern, excluding all others:
aws s3 cp s3://mybucket/logs/ s3://mybucket2/logs/ --recursive --exclude "*" --include "*.log"
Output:
copy: s3://mybucket/test/test.log to s3://mybucket2/test/test.log
copy: s3://mybucket/test3.log to s3://mybucket2/test3.log
In your case it will be:
aws s3 cp s3://mybucket/data/ s3://mybucket/data/ --recursive --exclude "*" --include "file_*.txt"
Source: AWS CLI cp command documentation
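Note that cp with --recursive preserves the relative key paths, so the copies keep their date/raw/filtered prefixes rather than landing directly under s3://mybucket/data/. If the goal is to flatten the files to the top of the data/ prefix, one option is a short loop over the matching keys. A minimal sketch, assuming a POSIX shell, a configured AWS CLI, and the bucket/prefix names from the question (keys containing spaces would need extra handling):
# List every key under data/, keep only those whose last path component starts with "file_",
# then move each object to the top of the data/ prefix.
aws s3 ls s3://mybucket/data/ --recursive \
  | awk '{print $4}' \
  | grep '/file_[^/]*\.txt$' \
  | while read -r key; do
      aws s3 mv "s3://mybucket/$key" "s3://mybucket/data/$(basename "$key")"
    done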
Upvotes: 1
Reputation: 1
I think you just have to run
aws s3 cp s3://mybucket/data/ s3://mybucket/data/ --recursive --exclude "*" --include "file_file*.txt"
to copy them.
Upvotes: 0