RK.

Reputation: 607

How to create a log and export it to an S3 bucket by executing a Python Lambda function

From my Lambda Python code, I'm trying to write a log file to my S3 bucket (s3://my_bucket/logs/), but it throws an error. It works fine and generates logs when I run it outside Lambda. The error is below:

**[Errno 2] No such file or directory: '/var/task/s3:/my_bucket/logs/error.log': FileNotFoundError.**

When this line of code is used in my local environment, it creates the log properly:

`LOGFILE_PATH = "D:\logs\error.log"`

And when I tried running it in Lambda after updating it to:

`LOGFILE_PATH = "s3://my_bucket/logs/"`

it throws the error above. The original

`LOGFILE_PATH = "D:\logs\error.log"`

still works when executed locally.

It should generate the log in my S3 bucket, but it isn't being created. Can we even write logs to S3 from a Lambda execution?

Thanks.

Upvotes: 0

Views: 2021

Answers (1)

danimal

Reputation: 1697

The notation you've used, s3://my_bucket/logs/, is not a real address. It's a kind of shorthand, mostly used with the AWS CLI's s3 commands, and it won't work the same way as a URL or a file-system path. If you want to write to a bucket (instead of a local file) from a Python Lambda, you should probably be using boto3 and its S3 client to store the file. It also depends on what exactly your code does with the LOGFILE_PATH variable.
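
As a minimal sketch of that approach (assuming the bucket name `my_bucket` and key `logs/error.log` from your question, and that you're using the standard `logging` module): write the log to Lambda's writable `/tmp` directory first, then upload it with boto3.

```python
import logging
import boto3

# /tmp is the only writable path inside the Lambda execution environment.
LOGFILE_PATH = "/tmp/error.log"
S3_BUCKET = "my_bucket"        # bucket name taken from the question
S3_KEY = "logs/error.log"      # destination key under logs/

logging.basicConfig(filename=LOGFILE_PATH, level=logging.ERROR)
logger = logging.getLogger(__name__)

def lambda_handler(event, context):
    try:
        # ... your existing logic here ...
        raise ValueError("example failure")
    except Exception:
        logger.exception("Something went wrong")
        # Upload the local log file to the bucket once it has been written.
        boto3.client("s3").upload_file(LOGFILE_PATH, S3_BUCKET, S3_KEY)
    return {"statusCode": 200}
```

Note that each upload overwrites the object at that key, so in practice you'd probably want to include a timestamp or the request ID in the key. Also remember that Lambda already sends anything you log to CloudWatch Logs, which may be enough on its own.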

Upvotes: 3
