Reputation: 184
Once a file is uploaded to an S3 bucket, is there any way to trigger a copy of the same file to another S3 bucket with a different directory structure?
The two S3 buckets are under the same account.
One possible solution is that I could trigger the copy from the client that uploads to the S3 bucket. But the problem there is that a lot of different clients upload to the original bucket.
I want to know if AWS has any service for triggering such a copy.
Upvotes: 3
Views: 5660
Reputation: 328
I know the question is old, but since I ended up here, other people might as well.
I think an option could be Lambda functions (I don't think Lambda was available at the time this question was posted). You can enable a trigger on a certain bucket so that every time something is uploaded to that bucket, the trigger calls a Lambda function. In your situation, you could set a trigger on your bucket to watch for uploads; once that trigger fires, your Lambda function should do something like "get that file and upload it again to the other bucket".
Some Lambda docs: https://aws.amazon.com/pt/documentation/lambda/
And some Lambda with S3 docs: http://docs.aws.amazon.com/lambda/latest/dg/with-s3.html
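To be clear about the wiring: the trigger is usually added in the S3 console (bucket properties, event notifications), but it can also be set from code. A minimal sketch, assuming a placeholder bucket name and function ARN, and assuming S3 has already been granted permission to invoke the function:

// Minimal sketch of wiring the S3 -> Lambda trigger from code instead of
// the console. 'your_source_bucket' and the function ARN are placeholders.
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

s3.putBucketNotificationConfiguration({
    Bucket: 'your_source_bucket',
    NotificationConfiguration: {
        LambdaFunctionConfigurations: [{
            LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:your_copy_function',
            Events: ['s3:ObjectCreated:*'] // fire on every new upload
        }]
    }
}, (err) => {
    if (err) console.log('Could not set bucket notification: ', err);
});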
I think the code for the function itself could look something like this:
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = (event, context, callback) => {
    // Bucket and key of the object that fired the trigger; keys arrive
    // URL-encoded, with '+' standing in for spaces, in the event record
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key
    };
    s3.getObject(params, (err, data) => {
        if (err) {
            console.log('Error found trying to get Object from S3: ', err);
            return callback(err);
        }
        console.log('Get Object done ', data.Body);
        // Re-upload the body to the other bucket; the bucket and key here
        // are placeholders for your own names and directory structure
        const param_copy = { Bucket: 'your_copy_bucket', Key: 'your_copy_key', Body: data.Body };
        s3.upload(param_copy, (err) => {
            if (err) {
                console.log('Problem on uploading: ', err);
                return callback(err);
            }
            console.log('done');
            callback(null);
        });
    });
};
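On a side note, since the question is only about copying, you can avoid pulling the body through the function at all and let S3 do a server-side copy. A minimal sketch with the same SDK; the destination bucket and the remapped key are placeholder assumptions:

// Server-side copy: S3 copies the object directly, so the body never
// passes through the Lambda function. Destination names are placeholders.
const copy_params = {
    Bucket: 'your_copy_bucket',
    CopySource: encodeURIComponent(bucket + '/' + key), // source given as 'bucket/key'
    Key: 'some/other/prefix/' + key // remap the directory structure here
};
s3.copyObject(copy_params, (err) => {
    if (err) return callback(err);
    callback(null);
});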
Upvotes: 5
Reputation: 179384
It won't be real-time, but the delay is only a few minutes: enable logging on the first bucket, set up a job to download and parse those logs (which will include the PUT requests), and use that to know which files need to be synced. There is no built-in mechanism in S3 to sync buckets like this.
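For what it's worth, a hedged sketch of the parsing step in Node, assuming the logs land in a placeholder bucket called your_log_bucket; S3 server access log lines are space-delimited and record uploads with the operation REST.PUT.OBJECT:

// Scan delivered S3 access logs for PUT operations to find new uploads.
// Bucket names are placeholders; the field positions assume a naive
// space split of the standard access log format, where the timestamp
// occupies two tokens and is followed by IP, requester, and request ID.
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

s3.listObjects({ Bucket: 'your_log_bucket' }, (err, listing) => {
    if (err) return console.log('Could not list logs: ', err);
    listing.Contents.forEach((logFile) => {
        s3.getObject({ Bucket: 'your_log_bucket', Key: logFile.Key }, (err, data) => {
            if (err) return console.log('Could not read log: ', err);
            data.Body.toString().split('\n').forEach((line) => {
                const fields = line.split(' ');
                if (fields[7] === 'REST.PUT.OBJECT') {
                    console.log('New object to sync: ', fields[8]);
                }
            });
        });
    });
});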
Upvotes: 1
Reputation: 2682
This answer can help you: Notification of new S3 objects
The summary is that, at this moment, there is no notification for new objects except for the s3:ReducedRedundancyLostObject event. The official documentation is here: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTnotification.html
The solution is to implement the logic in your code or poll the bucket.
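If you go the polling route, a minimal sketch could look like this (the bucket names are placeholders, and the last-checked timestamp would need to be persisted somewhere in a real deployment):

// Poll the source bucket and server-side copy anything newer than the
// previous poll. For large buckets you would also need to follow the
// pagination markers that listObjects returns.
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

let lastChecked = new Date(0);

function poll() {
    const cutoff = lastChecked;
    lastChecked = new Date();
    s3.listObjects({ Bucket: 'your_source_bucket' }, (err, listing) => {
        if (err) return console.log('Could not list bucket: ', err);
        listing.Contents
            .filter((obj) => obj.LastModified > cutoff)
            .forEach((obj) => {
                s3.copyObject({
                    Bucket: 'your_copy_bucket',
                    CopySource: encodeURIComponent('your_source_bucket/' + obj.Key),
                    Key: obj.Key
                }, (err) => {
                    if (err) console.log('Copy failed for ', obj.Key, err);
                });
            });
    });
}

setInterval(poll, 60 * 1000); // check once a minute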
Upvotes: 3