williamsa

Reputation: 1

Upload files to the /tmp directory in Lambda

I have a Lambda function that triggers when an S3 upload happens. It downloads the file to /tmp and then sends it to GCP Storage. The issue is that the log files can be up to 900 MB, so there is not enough space in the Lambda function's /tmp storage. Is there a way around this?

I tried sending it to memory, but I believe the memory is read-only. There is also talk about mounting EFS, but I am not sure that will work.

Here is the relevant part of the Lambda function:

import logging
import boto3
from google.cloud import storage

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.client('s3')
google_storage = storage.Client()
google_bucket_name = 'YOUR_GCS_BUCKET_NAME'

def lambda_handler(event, context):
    logger.info(event)

    # retrieve bucket name and file_key from the S3 event
    s3_bucket_name = event['Records'][0]['s3']['bucket']['name']
    file_key = event['Records'][0]['s3']['object']['key']
    logger.info('Reading {} from {}'.format(file_key, s3_bucket_name))

    logger.info(s3_bucket_name)
    logger.info(file_key)

    # download from S3 into /tmp
    s3.download_file(s3_bucket_name, file_key, '/tmp/{}'.format(file_key))

    # upload to the Google bucket
    bucket = google_storage.get_bucket(google_bucket_name)
    blob = bucket.blob(file_key)
    blob.upload_from_filename('/tmp/{}'.format(file_key))

This is the error from the CloudWatch logs for the Lambda function:

[ERROR] OSError: [Errno 28] No space left on device
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 30, in lambda_handler
    s3.download_file(s3_bucket_name, file_key, '/tmp/

Upvotes: 0

Views: 696

Answers (1)

awfullyCold

Reputation: 94

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket("YOUR_BUCKET_NAME")
blob = bucket.blob("file/path.csv")  # destination path in your GCS bucket
blob.upload_from_filename("/tmp/path.csv")  # local file in /tmp

I hope that helps you.

Upvotes: 1
