Reputation: 623
I am writing a Lambda function that creates a text file and uploads it to a specific S3 directory.
    import boto3
    import os

    def lambda_handler(event, context):
        src_bucket = event['Records'][0]['s3']['bucket']['name']
        filepath = event['Records'][0]['s3']['object']['key']
        head, tail = os.path.split(filepath)
        new_head = head.replace("/", "").upper()
        new_filename = "_".join((new_head, tail))
        s3_client = boto3.client('s3')
        s3 = boto3.resource('s3')
        string = "dfghj"
        encoded_string = string.encode("utf-8")
        file_name = "_".join(('ErrorLog', os.path.splitext(new_filename)[0] + '.txt'))
        print(file_name)
        s3_path = (head + '/errorlog/') + file_name
        print(s3_path)
        s3.Bucket(src_bucket).put_object(Key=s3_path, Body=encoded_string)
It executes without any errors. However, it goes into an infinite loop, repeatedly creating nested errorlog/ subfolders containing the file.
For example, if file_name = "ErrorLog_test1.txt" and s3_path = "folder1/errorlog/ErrorLog_test1.txt", it keeps nesting "errorlog" folders inside one another, like "folder1/errorlog/errorlog/errorlog/ErrorLog_test1.txt".
How do I stop it from creating recursive folders? I believe I am doing something wrong when building the s3_path key.
Upvotes: 2
Views: 6765
Reputation: 270144
It appears you have configured an Event on the Amazon S3 bucket to trigger the Lambda function when an object is created.
When an object is created, the Lambda function is triggered. The Lambda function creates an S3 object, which triggers an event, which triggers the Lambda function, which creates another object... and so on. Yes, it is an infinite loop that continues until some limit is reached.
I'm not sure what you are doing with the objects, but the safest method would be to configure the Event to trigger only for a given Path (sub-folder). Then, configure the Lambda function to create the new file in a different path so that it does not trigger the Event again.
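As a belt-and-braces complement to the prefix filter, you can also bail out inside the handler when the key is one the function itself wrote. This is only a sketch: `OUTPUT_PREFIX` and `should_process` are illustrative names, not part of the original code, and the prefix should match whatever output path you choose.

    # Hypothetical guard: skip keys that already live under the output path,
    # so the function returns before writing and re-triggering itself.
    OUTPUT_PREFIX = "errorlog/"

    def should_process(key):
        # Return False for objects this Lambda function created itself.
        return OUTPUT_PREFIX not in key

    print(should_process("folder1/test1.csv"))                    # True
    print(should_process("folder1/errorlog/ErrorLog_test1.txt"))  # False

Inside `lambda_handler`, you would call `should_process(filepath)` right after extracting the key and return early when it is False. Note this still consumes a (cheap) invocation per self-created object, which is why the event prefix/suffix filter remains the primary fix.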
Upvotes: 4