Hexie

Reputation: 4221

API Gateway GET / PUT large files into S3

Following this AWS documentation, I was able to create a new endpoint on my API Gateway that can manipulate files in an S3 repository. The problem I'm having is the file size (API Gateway has a payload limit of 10MB).

I was wondering, without using a Lambda work-around (this link would help with that), would it be possible to upload and get files bigger than 10MB (even as binary if needed), seeing as this is using an S3 service as a proxy - or does the limit apply regardless?

I've tried PUTting and GETting files bigger than 10MB, and each time the response is the usual "message": "Timeout waiting for endpoint response".

It looks like Lambda is the only way; I'm just wondering if anyone else has gotten around this while using S3 as a proxy.

Thanks

Upvotes: 12

Views: 21010

Answers (2)

Ka Hou Ieong

Reputation: 6515

You can create a Lambda proxy function that returns a redirect to an S3 pre-signed URL.

Example JavaScript code that generates a pre-signed S3 URL:

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var s3Params = {
    Bucket: 'test-bucket',
    Key: file_name, // the object key you want the client to upload to
    ContentType: 'application/octet-stream',
    Expires: 10000 // seconds the pre-signed URL stays valid
};

s3.getSignedUrl('putObject', s3Params, function(err, url) {
    // "url" is the pre-signed URL to send back to the client
});

Then your Lambda function returns a redirect response to your client, like this:

{
    "statusCode": 302,
    "headers": { "Location": "url" }
}
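
For context, here is a minimal sketch of what such a handler could look like end to end (Node.js with aws-sdk v2; the bucket name and the "key" query string parameter are assumptions - adjust them to your API):

// Minimal sketch of a Lambda handler that redirects the caller to a
// pre-signed S3 PUT URL. Bucket name and query parameter are placeholders.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
    const params = {
        Bucket: 'test-bucket',                    // assumed bucket name
        Key: event.queryStringParameters.key,     // assumed query parameter
        ContentType: 'application/octet-stream',
        Expires: 10000                            // URL validity in seconds
    };

    // Returns the pre-signed URL as a string
    const url = await s3.getSignedUrlPromise('putObject', params);

    return {
        statusCode: 302,
        headers: { Location: url }
    };
};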

You might find more of the information you need in this documentation.

Upvotes: 12

Ashan

Reputation: 19738

If you have large files, consider uploading them to S3 directly from your client. You can create an API endpoint that returns a signed URL for the client to use for the upload, which also lets you implement access control over your private content.
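
As a rough sketch, once the client has fetched the signed URL from your endpoint, the upload itself is a plain HTTP PUT (Node.js shown here; the URL source and file path are placeholders):

const fs = require('fs');
const https = require('https');

// signedUrl comes from your API endpoint; shown as a placeholder here
const signedUrl = process.env.SIGNED_URL;
const filePath = '/path/to/large-file.bin';   // placeholder local file

const { size } = fs.statSync(filePath);
const req = https.request(signedUrl, {
    method: 'PUT',
    headers: {
        // Content-Type must match what was used when signing the URL
        'Content-Type': 'application/octet-stream',
        'Content-Length': size
    }
}, function (res) {
    console.log('Upload finished with status', res.statusCode);
});

// Stream the file so large uploads do not have to fit in memory
fs.createReadStream(filePath).pipe(req);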

Also, for even larger files you can consider using multipart uploads to speed up uploading.
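
For reference, a minimal sketch using the JavaScript SDK's managed uploader (s3.upload), which splits large bodies into multipart parts automatically; bucket, key, and file path are placeholders:

const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.upload({
    Bucket: 'test-bucket',              // placeholder bucket
    Key: 'large-file.bin',              // placeholder object key
    Body: fs.createReadStream('/path/to/large-file.bin')
}, {
    partSize: 10 * 1024 * 1024,         // 10 MB parts
    queueSize: 4                        // parts uploaded in parallel
}, function (err, data) {
    if (err) return console.error(err);
    console.log('Uploaded to', data.Location);
});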

Upvotes: 3
