Reputation: 21
I have a Fastify app with a route /api/upload where I upload an image, using the @fastify/multipart npm package.
The handler for this route basically takes the file's stream and uploads it directly to an AWS S3 bucket.
This is the implementation I tried:
import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

export const uploadImage = async (req, res) => {
  /* get the uploaded file data */
  const data = await req.file();

  /* init the S3 client */
  const S3 = new S3Client({
    region: 'your-region',
    credentials: {
      accessKeyId: 'your-access-key',
      secretAccessKey: 'your-secret-key'
    }
  });

  /* create the command to add a new object to the S3 bucket */
  const command = new PutObjectCommand({
    Bucket: 'your-bucket-name',
    Key: 'fileName.jpg',
    Body: data.file /* THIS IS WHERE I GIVE THE FILE STREAM */
  });

  /* send the command */
  await S3.send(command);

  return res.code(200).send({
    status: 'success'
  });
};
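For context, the handler is registered roughly like this (the port and import path here are placeholders, not my exact setup):

import Fastify from 'fastify';
import multipart from '@fastify/multipart';
import { uploadImage } from './uploadImage.js'; /* placeholder path to the handler above */

const app = Fastify();

/* register @fastify/multipart so req.file() is available in handlers */
app.register(multipart);

/* the upload route */
app.post('/api/upload', uploadImage);

app.listen({ port: 3000 });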
When this handler is executed, an error is thrown with the following payload:
{
  "status": "error",
  "typeCode": "SERVER_ERROR",
  "message": "A header you provided implies functionality that is not implemented",
  "error": {
    "name": "NotImplemented",
    "$fault": "client",
    "$metadata": {
      "httpStatusCode": 501,
      "requestId": "5TCZZ4KZC5PQW2SY",
      "extendedRequestId": "vzS1IWfX0thCd1PO30SnP1g1cBd0uWrHewCMLtvpHe+Ut2onCv27QgK2JauxwIeT7tTGmBGkSAo=",
      "attempts": 1,
      "totalRetryDelay": 0
    },
    "Code": "NotImplemented",
    "Header": "Transfer-Encoding",
    "RequestId": "5TCZZ4KZC5PQW2SY",
    "HostId": "vzS1IWfX0thCd1PO30SnP1g1cBd0uWrHewCMLtvpHe+Ut2onCv27QgK2JauxwIeT7tTGmBGkSAo=",
    "message": "A header you provided implies functionality that is not implemented",
    "HTTPCode": 500,
    "status": "error",
    "typeCode": "SERVER_ERROR"
  }
}
I don't understand why; the error points at the Transfer-Encoding header, but I never set that header myself. I've looked all over the internet and can't find a solution.
I tried uploading the same file to S3 by reading it from disk as a stream with the native node:fs module, like this:
import fs from 'node:fs';
import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

export const uploadImage = async (req, res) => {
  /* read the file directly from disk */
  const readStream = fs.createReadStream('path-to-file');

  /* init the S3 client */
  const S3 = new S3Client({
    region: 'your-region',
    credentials: {
      accessKeyId: 'your-access-key',
      secretAccessKey: 'your-secret-key'
    }
  });

  /* create the command to add a new object to the S3 bucket */
  const command = new PutObjectCommand({
    Bucket: 'your-bucket-name',
    Key: 'fileName.jpg',
    Body: readStream /* THIS WORKS */
  });

  /* send the command */
  await S3.send(command);

  return res.code(200).send({
    status: 'success'
  });
};
And it works as expected; the only difference I can see is the Body stream.
I would really need some help, please.
Thank you for your time.
Upvotes: 0
Views: 1185
Reputation: 21
I've resolved my issue, thanks to this answer from Stack Overflow: Upgrade aws-sdk to version 3 - streaming S3 upload.
It seems to be a known issue: PutObjectCommand needs a body whose length is known up front. With a stream of unknown length, such as the multipart file stream, the SDK falls back to Transfer-Encoding: chunked, which S3 rejects with the NotImplemented error above. That would also explain why the fs.createReadStream version works: the SDK can apparently determine the size of a file on disk.
Instead of the suggested answer, what I did is simply use the Upload helper from @aws-sdk/lib-storage:
import { Upload } from '@aws-sdk/lib-storage';
import { S3Client } from '@aws-sdk/client-s3';

export const uploadImage = async (req, res) => {
  /* get the uploaded file data */
  const data = await req.file();

  /* init the S3 client */
  const S3 = new S3Client({
    region: 'your-region',
    credentials: {
      accessKeyId: 'your-access-key',
      secretAccessKey: 'your-secret-key'
    }
  });

  const upload = new Upload({
    client: S3,
    params: {
      Bucket: 'your-bucket-name',
      Key: 'fileName.jpg',
      Body: data.file,
      /* use the mimetype reported by @fastify/multipart
         instead of a hard-coded one */
      ContentType: data.mimetype
    }
  });

  await upload.done();

  return res.code(200).send({
    status: 'success'
  });
};
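Upload from @aws-sdk/lib-storage streams the body to S3 as a multipart upload, splitting it into buffered parts, so the total length never has to be known in advance. For small files, another option I considered (just a sketch, I haven't benchmarked it) is to buffer the whole file in memory with the toBuffer() helper that @fastify/multipart provides, and keep using PutObjectCommand:

import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

export const uploadImage = async (req, res) => {
  const data = await req.file();

  /* buffer the whole file in memory; fine for small images,
     risky for large uploads */
  const body = await data.toBuffer();

  const S3 = new S3Client({
    region: 'your-region',
    credentials: {
      accessKeyId: 'your-access-key',
      secretAccessKey: 'your-secret-key'
    }
  });

  /* a Buffer has a known length, so the SDK no longer falls back
     to Transfer-Encoding: chunked */
  await S3.send(new PutObjectCommand({
    Bucket: 'your-bucket-name',
    Key: 'fileName.jpg',
    Body: body,
    ContentType: data.mimetype
  }));

  return res.code(200).send({ status: 'success' });
};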
Upvotes: 2