Reputation: 147
I want to fetch an image from AWS S3 and perform manipulation on it. I am using streams to avoid loading large files into memory.
import AWS from 'aws-sdk'
import sharp from 'sharp'

const s3 = new AWS.S3()

const transformer = (w, res, next) =>
  sharp()
    .resize(w)
    .on('data', (data) => {
      console.log(data)
      res.write(data, 'binary')
    })
    .on('error', (err) => next(err))
    .on('end', () => {
      console.log('finished')
      res.status(200).end()
    })

const readStream = s3
  .getObject({
    Bucket: process.env.UPLOAD_BUCKET_NAME,
    Key: 'test.jpg'
    // Key: `${req.uid.uid}/${req.param('img')}`
  })
  .createReadStream()

const getImage = (w, res, next) => {
  readStream.pipe(transformer(w, res, next))
  readStream.on('error', (err) => next(err))
}

export default getImage
I am calling the getImage method from the route. It serves the image the first time, but on the second request it throws: Error: Input buffer contains unsupported image format
Upvotes: 5
Views: 15053
Reputation: 1
For those who get this error using API Gateway, you might need to allow "binaryMediaTypes" on it. Here are the lines, using AWS CDK, for my API Gateway acting as a proxy for a Fargate cluster.
const proxyApiGateway = new Apigateway.RestApi(this, "id-x", {
  ...,
  defaultCorsPreflightOptions: {
    allowOrigins: [<allowed urls>],
    allowCredentials: true,
    allowMethods: Apigateway.Cors.ALL_METHODS,
    allowHeaders: Apigateway.Cors.DEFAULT_HEADERS,
  },
  binaryMediaTypes: ["multipart/form-data", "image/*", "application/pdf"],
})
Upvotes: 0
Reputation: 147
I found the solution.
import AWS from 'aws-sdk'
import sharp from 'sharp'
import { pipeline } from 'stream'

const s3 = new AWS.S3()

const getImage = (req, res, next) => {
  const w = parseInt(req.params.w)
  const readStream = s3
    .getObject({
      Bucket: process.env.UPLOAD_BUCKET_NAME,
      Key: 'test.jpg'
    })
    .createReadStream()
  const transformer = (w) => sharp().resize(w)
  pipeline(readStream, transformer(w), res, (err) => {
    if (err) {
      next(err)
    }
  })
}

export default getImage
It was failing because the read stream was created once at module scope instead of per request. The first request consumed the stream, so later requests piped an already-drained stream into sharp, which then reported the unsupported-format error.
Upvotes: 1