Reputation: 686
I'm uploading files from a Node.js script, and when I try to copy an object I get an Access Denied error. If I try to delete the object or get it, there is no problem and it succeeds.
Is there anything special about CopyObject?
var params = {
    "Bucket": "bucket-name",
    "CopySource": "source-path/object.txt",
    "Key": "source-path/object2.txt"
};
s3.copyObject(params, function(err, data) {
    // Handle success or error here.
});
Upvotes: 6
Views: 13272
Reputation: 1456
I had a similar issue to yours in that copyObject produced a 403 "Access Denied" response, but a getObject followed by a putObject worked fine. In my case, I did have the correct syntax for the CopySource parameter.
The solution to my problem was to add the s3:GetObjectTagging and s3:PutObjectTagging permissions to the IAM role performing the copy, since the copyObject operation will attempt to copy the object's tags over as well. I know this isn't the solution to your problem, but I'm putting this answer down in case someone else hits something very similar.
Unfortunately, the documentation for the Node.js API and the S3 service does not mention this permission requirement. I learned the solution from an SO answer.
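For reference, a minimal sketch of the extra IAM policy statement such a role might need; the bucket name "my-bucket" is a placeholder, not something from the original posts:

```javascript
// Sketch of an IAM policy statement for a role that performs copyObject.
// Besides the usual read/write actions, it grants the tagging permissions,
// because copyObject reads the source object's tags and writes them to
// the destination by default.
var statement = {
    Effect: "Allow",
    Action: [
        "s3:GetObject",
        "s3:PutObject",
        "s3:GetObjectTagging", // needed to read tags from the source object
        "s3:PutObjectTagging"  // needed to write tags to the destination object
    ],
    Resource: "arn:aws:s3:::my-bucket/*"
};
console.log(JSON.stringify(statement, null, 2));
```

Attach a statement like this to the copying role (or add the two tagging actions to an existing statement) and the 403 from copyObject should go away.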
Upvotes: 4
Reputation: 686
Solved it! The problem was in my CopySource path: it needs to have the bucket name first, like so: bucket-name/objectkey
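A minimal sketch of the corrected params, reusing the placeholder names from the question:

```javascript
// CopySource must be "<source-bucket>/<source-key>", not just the key.
var params = {
    Bucket: "bucket-name",                              // destination bucket
    CopySource: "bucket-name/source-path/object.txt",   // bucket name prefixed
    Key: "source-path/object2.txt"                      // destination key
};
console.log(params.CopySource);
```

With the bucket name prefixed, S3 can resolve the source object and the Access Denied error no longer occurs.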
Upvotes: 21
Reputation: 137
Per the AWS S3 copyObject docs, the CopySource parameter should include the bucket and key names. For example:
var params = {
    CopySource: 'source_bucket/source_key',
    Bucket: 'destination_bucket_name',
    Key: 'destination_key'
};
s3.copyObject(params, function(error, data) {
    if (error) {
        console.log(error, error.stack);
        return; // don't report success on error
    }
    console.log('S3 object copied');
});
Upvotes: 12