Reputation: 503
I'm looking through the AWS CLI documentation and I cannot find a way to copy only the files in some directory structure to another bucket with a "flattened" structure (I want one directory with all the files inside it).
For example, given:
/a/b/c/1.jpg
/a/2.jpg
/a/b/3.jpg
I would want to have in a different bucket:
/x/1.jpg
/x/2.jpg
/x/3.jpg
Am I missing something, or is it impossible? Do you have an idea how I could do that?
Upvotes: 4
Views: 5527
Reputation: 1
This worked for me:
# List all objects, keep only keys matching the pattern, and let ConvertFrom-String
# split each whitespace-separated line into properties P1..P4 (date, time, size, key)
$foo = aws s3 ls s3://blah/blah --recursive | Select-String -Pattern "somePattern" | ConvertFrom-String
$foo | % {
    # P4 is the object key column
    $f = $_.P4
    $fileToCopy = "s3://blah/$f"
    $fileToCopy
    # copy each matching object into the current directory, flattened
    aws s3 cp $fileToCopy .
}
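This downloads the matching objects into the current local directory, which flattens them as a side effect. To land them directly in another bucket instead, the destination of the cp can be an S3 URL; a minimal variant of the last line in the loop, assuming a hypothetical destination bucket and folder:
aws s3 cp $fileToCopy s3://otherbucket/x/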
Upvotes: 0
Reputation: 309
Here are some examples for your reference. Note that sync only works on directories; to copy individual files, use cp:
aws s3 cp /a/b/c/1.jpg s3://bucketname/
aws s3 cp /a/2.jpg s3://bucketname/
aws s3 cp /a/b/3.jpg s3://bucketname/
To sync all the contents of a directory to an S3 bucket:
aws s3 sync /directoryPath/ s3://bucketname/
AWS reference URL: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
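Note that these commands upload from the local filesystem. If the source files already live in S3, a server-side cp between buckets avoids downloading anything; a minimal sketch, with hypothetical bucket names:
aws s3 cp s3://sourcebucket/a/b/c/1.jpg s3://destinationbucket/x/1.jpg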
Upvotes: -1
Reputation: 5649
You don't need to download files locally, as suggested in another answer. Instead, you could write a shell script or something that does the following:
1. Run aws s3 ls --recursive against s3://bucket1 to get the fully-qualified names of all files in it.
2. For each file, issue an aws s3 cp to copy it server-side into s3://bucket2/x/, keeping only the file name. A sketch follows.
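A minimal sketch of such a script, assuming the bucket names above and object keys without embedded spaces (the key is the 4th whitespace-separated column of the ls output):
#!/bin/bash
# List every object in the source bucket and extract the object key
aws s3 ls s3://bucket1 --recursive | awk '{print $4}' | while read -r key; do
    # Server-side copy; nothing is downloaded locally. Keeping only the
    # basename of the key flattens the directory structure.
    aws s3 cp "s3://bucket1/$key" "s3://bucket2/x/$(basename "$key")"
done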
Upvotes: 1
Reputation: 1324
Assuming that you have the AWS CLI configured on the system and that both buckets are in the same region, you can first download the S3 bucket to your local machine using:
aws s3 sync s3://originbucket /localdir/
After this, use a find command to move all the files into one directory:
find /localdir/ -type f -exec mv {} /anotherlocaldir/ \;
Finally, you can upload the files to S3 again:
aws s3 sync /anotherlocaldir/ s3://destinationbucket
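One caveat: if files in different subdirectories share the same base name, the mv in the second step will silently overwrite them. With GNU mv you can pass -n (no-clobber) so collisions stay behind in /localdir/ where you can spot them:
find /localdir/ -type f -exec mv -n {} /anotherlocaldir/ \;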
Upvotes: 3