Reputation: 165
I upload a PDF blob to S3:
const params = {
  Bucket: "poster-print-bucket",
  Key: Date.now().toString() + ".pdf",
  Body: blob,
  ContentType: "application/pdf",
};

const uploaded = await S3.upload(params).promise();
When I open the URL, e.g. https://poster-print-bucket.s3.ca-central-1.amazonaws.com/1633526785678.pdf, it downloads a blank PDF.
I thought maybe my blob was corrupted or something, but I managed to upload the same blob to Firebase Storage just fine.
By the way, I'm using a Next.js api/upload-poster route.
What's happening?
Upvotes: 1
Views: 2521
Reputation: 49
The problem comes from the AWS API Gateway configuration, which by default does not pass binary data through.
By following these resources, I found a solution:
If you are using cdk:
import {
  RestApi,
  LambdaIntegration,
  ContentHandling,
  PassthroughBehavior,
} from 'aws-cdk-lib/aws-apigateway';

/* "this" refers to your stack object */
const api = new RestApi(this, `my-api-gateway-name`, {
  /* ...the rest of your configuration */
  binaryMediaTypes: ['*/*'],
});

const integration = new LambdaIntegration(handler, {
  /* ...the rest of your configuration */
  contentHandling: ContentHandling.CONVERT_TO_BINARY,
  passthroughBehavior: PassthroughBehavior.WHEN_NO_MATCH,
});

api.root.addProxy({
  defaultIntegration: integration,
  anyMethod: true,
});
If you are not using CDK, go to API Gateway in the AWS console, find your API, and set the same configuration there.
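For reference, the same setting can also be applied programmatically. A sketch using the AWS SDK v3 API Gateway client (`UpdateRestApiCommand` from `@aws-sdk/client-api-gateway`); the REST API id below is a placeholder, and note that the `/` inside a media type must be escaped as `~1` in the patch path, per JSON Pointer rules:

```javascript
// API Gateway patch paths follow JSON Pointer escaping, so the '/' in a
// media type such as '*/*' becomes '~1'.
function binaryMediaTypePatch(mediaType) {
  return { op: 'add', path: '/binaryMediaTypes/' + mediaType.replace(/\//g, '~1') };
}

// Usage (not run here; requires AWS credentials and the package installed):
// const { APIGatewayClient, UpdateRestApiCommand } = require('@aws-sdk/client-api-gateway');
// const client = new APIGatewayClient({ region: 'ca-central-1' });
// await client.send(new UpdateRestApiCommand({
//   restApiId: 'your-rest-api-id',   // placeholder
//   patchOperations: [binaryMediaTypePatch('*/*')],
// }));
```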
Upvotes: 2
Reputation: 1449
Using the AWS SDK v3 (up to date at the time of this post), you can use PutObjectCommand, which accepts a Uint8Array as its Body parameter (docs).
Convert your Blob instance to an ArrayBuffer (docs), then convert the ArrayBuffer to a Uint8Array.
The code would look like:
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

const client = new S3Client(/* config */);

const arrayBuffer = await blob.arrayBuffer();
const typedArray = new Uint8Array(arrayBuffer);

await client.send(new PutObjectCommand({
  Bucket: /* ... */,
  Key: /* ... */,
  Body: typedArray,
}));
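As an aside, the Blob → Uint8Array step can be checked locally before wiring it up to S3; Node 18+ ships a global Blob with the same arrayBuffer() method browsers have. A minimal sketch of just that conversion:

```javascript
// Minimal sketch of the Blob -> Uint8Array conversion, runnable on its own
// (Node 18+ provides a global Blob; in the browser it is built in).
async function blobToUint8Array(blob) {
  const arrayBuffer = await blob.arrayBuffer();
  return new Uint8Array(arrayBuffer);
}
```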
Upvotes: 2
Reputation: 165
I spent more time fixing this issue than I would like to admit. Here is the solution:
Frontend (converting blob to base64 before sending to backend):
function toBase64(blob) {
  const reader = new FileReader();
  return new Promise((res, rej) => {
    reader.readAsDataURL(blob);
    reader.onload = function () {
      res(reader.result);
    };
    reader.onerror = rej; // reject on read errors so failures are not swallowed
  });
}
toBase64(currentBlob)
  .then((base64) => {
    return axios
      .post("/api/upload-poster", base64, {
        headers: {
          "Content-Type": "application/pdf",
        },
      })
      .then(({ data }) => data.uploaded.Location);
  })
Backend:
const base64 = req.body;
// Strip the data-URL prefix, then decode the remaining base64 into a Buffer
const base64Data = Buffer.from(base64.replace(/^data:application\/\w+;base64,/, ""), "base64");

const params = {
  Bucket: "poster-print-bucket",
  Key: nanoid() + ".pdf",
  Body: base64Data,
  ContentEncoding: "base64",
  ContentType: "application/pdf",
};

const uploaded = await S3.upload(params).promise();
Why is all this song and dance required? Couldn't it be something simpler?
Upvotes: 2