Reputation: 325
I have an endpoint that takes in form data including a file. This file can be a text file, image, or pdf. I'm using busboy (v0.2.14) to parse the form data. That code looks like this:
let buffers = [];
file.on('data', data => buffers.push(data));
file.on('end', () => {
result.filename = filename;
result.contentType = mimetype;
// Concat the chunks into a Buffer
result.file = Buffer.concat(buffers);
});
// ...
busboy.write(event.body, event.isBase64Encoded ? 'base64' : 'binary');
busboy.end();
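To rule out the concat step itself, it can be factored into a small promise-based helper (the name `collect` is mine) and checked against any readable stream in isolation:

```javascript
// Collect a stream's chunks and concat them into one Buffer.
function collect(stream) {
  return new Promise((resolve, reject) => {
    const buffers = [];
    stream.on('data', data => buffers.push(data));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(buffers)));
  });
}
```

Running this against a stand-in stream (e.g. `require('stream').Readable.from([...])`) reassembles the chunks correctly, so the corruption seems to happen before or after this step.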
However, when I push the file data up to S3 using the AWS SDK (v2.97.0), all the binary files are corrupted when I go to view them. This does not happen to text files. The S3 upload code looks like this:
static myPutObject(bucketName, fileName, data, contentType, acl) {
const params = {
Bucket: bucketName,
Key: fileName,
Body: data,
ACL: acl,
ContentType: contentType,
ContentEncoding: 'base64'
};
return new AWS.S3().putObject(params).promise();
}
I've tried everything that I can find on Stack Overflow or GitHub with no luck.
Upvotes: 5
Views: 2152
Reputation: 529
If you're using API Gateway in front, it will mangle the incoming binary payload unless you specifically enable binary media types. If you're deploying with the Serverless Framework (SLS), you can just add:
apiGateway:
binaryMediaTypes:
- '*/*'
in the provider section.
Read here: https://serverless.com/framework/docs/providers/aws/events/apigateway#binary-media-types
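With binary media types enabled, API Gateway delivers the raw body base64-encoded and sets `isBase64Encoded` on the proxy-integration event, so the Lambda side should decode it before handing it to busboy. A minimal sketch (the helper name is mine; the event shape is the standard Lambda proxy integration):

```javascript
// Decode the proxy-integration body into a Buffer.
// API Gateway sets event.isBase64Encoded when a binary media type matches.
function getBodyBuffer(event) {
  return event.isBase64Encoded
    ? Buffer.from(event.body, 'base64')
    : Buffer.from(event.body, 'utf8');
}
```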
Upvotes: 4
Reputation: 13055
S3 is an "object in" and "object out" store. It does not know whether your content is binary, text, or UTF-16 encoded. It stores all the bytes exactly as it receives them and serves them back when requested.
Here is how we validated whether the problem was on S3's side or in our code.
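Such a validation boils down to a byte-for-byte round trip: upload a known binary buffer, download it, and compare. A sketch using the v2 SDK (`roundTrip` and `sameBytes` are names I made up; bucket and key are placeholders, and `s3` is an `AWS.S3` client you supply):

```javascript
// Pure helper: byte-for-byte equality of two buffers.
function sameBytes(a, b) {
  return Buffer.compare(a, b) === 0;
}

// Upload `bytes`, download them again, and compare.
// If S3 were corrupting data, this check would fail.
async function roundTrip(s3, bucket, key, bytes) {
  await s3.putObject({ Bucket: bucket, Key: key, Body: bytes }).promise();
  const { Body } = await s3.getObject({ Bucket: bucket, Key: key }).promise();
  return sameBytes(bytes, Body);
}
```

If the round trip succeeds with a buffer you construct locally, the corruption is happening before the upload (e.g. in how the request body was decoded), not in S3.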
Hope it helps.
Upvotes: 1