Reputation: 553
I have created this function to pull objects from my S3 bucket. It works, but because of the -Key parameter I can only copy one file at a time.
Is there any way to back up the entire contents of the bucket without writing multiple Copy-S3Object cmdlets?
function CopyFromS3ToFolder($S3_Bucket, $S3_Folder_Destination, $S3_Key, $S3_SecretKey, $S3_AccessKey, $S3_Region)
{
    # http://docs.aws.amazon.com/powershell/latest/reference/Index.html (Amazon Simple Storage Service)
    # version AWSToolsAndSDKForNet_sdk-2.0.11.0-ps-2.0.11.0-tk-1.6.5.2
    Write-Host "Copying from S3 to Local Directory"
    Write-Host "Folder Name :$S3_Folder_Destination"
    Copy-S3Object -BucketName $S3_Bucket -LocalFile $S3_Folder_Destination -SecretKey $S3_SecretKey -AccessKey $S3_AccessKey -Region $S3_Region -Key $S3_Key
}
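For context, this is how it gets invoked today, one call per object (the bucket name, keys, folder, and credential variables below are placeholders, not my real values):

CopyFromS3ToFolder -S3_Bucket "my-bucket" -S3_Folder_Destination "C:\S3Backup\file1.txt" -S3_Key "8.9.2014/file1.txt" -S3_SecretKey $secret -S3_AccessKey $access -S3_Region "us-east-1"
CopyFromS3ToFolder -S3_Bucket "my-bucket" -S3_Folder_Destination "C:\S3Backup\file2.txt" -S3_Key "8.9.2014/file2.txt" -S3_SecretKey $secret -S3_AccessKey $access -S3_Region "us-east-1"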
Upvotes: 3
Views: 548
Reputation: 553
$objects = Get-S3Object -BucketName $S3_Bucket -SecretKey $S3_SecretKey -AccessKey $S3_AccessKey -Region $S3_Region -KeyPrefix "8.9.2014"
foreach ($key in $objects.Key)
{
    # strip the key prefix so only the file name remains
    $filename = $key -replace "8.9.2014/"
    Copy-S3Object -BucketName $S3_Bucket -Key $key -LocalFile "$S3_Folder_Destination\$filename" -SecretKey $S3_SecretKey -AccessKey $S3_AccessKey -Region $S3_Region
}
see: https://forums.aws.amazon.com/thread.jspa?messageID=441291
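Depending on the version of the AWS Tools for PowerShell installed, Read-S3Object can also pull down everything under a prefix in a single call. A rough sketch, using the same variables as above (check the parameter names against your module version):

Read-S3Object -BucketName $S3_Bucket -KeyPrefix "8.9.2014" -Folder $S3_Folder_Destination -SecretKey $S3_SecretKey -AccessKey $S3_AccessKey -Region $S3_Region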
Upvotes: 2
Reputation: 301147
You can use Get-S3Object to get the keys you want, and then pipe that to Copy-S3Object, something like below:
Get-S3Object -BucketName "Bucket1" | %{ Copy-S3Object -BucketName "Bucket1" -Key $_.Key -LocalFile "path" }
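One caveat: with a fixed "path" every object would be written to the same local file, so in practice you derive the local name from each key. A sketch of that, assuming a placeholder backup folder and flattening any key prefixes into the file name:

Get-S3Object -BucketName "Bucket1" | %{
    # build a local path per object; "C:\S3Backup" is a placeholder folder
    $local = Join-Path "C:\S3Backup" ($_.Key -replace '/', '_')
    Copy-S3Object -BucketName "Bucket1" -Key $_.Key -LocalFile $local
}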
Upvotes: 2