Reputation: 162
In a Python Django application, I save multiple video files:
Save 1:
Save 2:
Save 3:
I have a Lambda trigger that uses MediaConvert to add HLS formats to these videos and to generate thumbnails. These three saves happen within a very short time of each other, since the files are assets of a single Social Media Post object.
For some reason, the S3 trigger fires for only some of the files: Save 1 and Save 3 trigger the Lambda, but Save 2 does not.
My assumption is that the S3 trigger has some sort of downtime between detecting new file uploads (and the periods between these uploads are near-instant). Is this assumption correct, and how can I work around it?
Upvotes: 7
Views: 1835
Reputation: 269131
It should fire for all objects.
When Amazon S3 triggers an AWS Lambda function, information about the object that caused the trigger is passed in the event parameter:
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "awsRegion": "us-west-2",
      "eventTime": "1970-01-01T00:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": {
          "name": "my-s3-bucket",
          "arn": "arn:aws:s3:::example-bucket"
        },
        "object": {
          "key": "HappyFace.jpg",
          "size": 1024,
          ...
        }
      }
    }
  ]
}
Note that Records is an array, so it is possible for multiple objects to be passed to a single Lambda invocation. I have never definitively seen this happen, but AWS's own sample code certainly assumes it can:
from urllib.parse import unquote_plus

def lambda_handler(event, context):
    for record in event['Records']:  # <-- Looping here
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])
        ...
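Putting the loop together with event logging gives a minimal, self-contained sketch. Logging the raw event and returning the processed keys are illustrative additions for debugging, not part of the AWS sample:

```python
import json
from urllib.parse import unquote_plus

def lambda_handler(event, context):
    # Log the raw event so missed or batched invocations can be examined
    # later in CloudWatch Logs
    print(json.dumps(event))

    processed = []
    for record in event.get('Records', []):
        bucket = record['s3']['bucket']['name']
        # S3 URL-encodes object keys in event notifications
        # (e.g. spaces arrive as '+')
        key = unquote_plus(record['s3']['object']['key'])
        processed.append((bucket, key))
    return processed
```

Comparing the logged events against the objects you uploaded will show whether an invocation was genuinely missed or whether two objects arrived in one batched event.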
Therefore, I would recommend printing the
event
at the start of the function, so it goes into the log for later examination.
Upvotes: 6