ntonnelier

Reputation: 1549

Upload image in local folder to S3

In my application I upload images to a local /tmp folder and apply some transformations. The pictures are saved properly there. After that I want to upload these images to an S3 bucket, but so far I only manage to produce blank pictures.

This is my code:

//Pick the local image and make it binary
var fs = require('fs');
var bufferedData = '';
fs.readFile(imagePath, function (err, data) {
   if (err) { throw err; }
   bufferedData = new Buffer(data, 'binary');
});


//Send data to s3    
const uploadToS3 = async (idKey: string, modifiers: string, bufferedData) => {
  try {
    return await S3.upload({
      Bucket: 'mirage-thumbnails',
      Key: `${process.env.APP_ENV}/${idKey}/${modifiers}`,
      Body: bufferedData,
      ContentType: 'image/png',
      ACL: 'public-read',
      CacheControl: 'max-age=0',
    }).promise();
  } catch (e) {
    console.error(e.message, e);
  }
};

Upvotes: 2

Views: 1945

Answers (1)

Marcos Casagrande

Reputation: 40444

readFile is asynchronous, so you need to wait until it finishes before uploading the data to S3. But instead of using readFile you can pass a readable stream to s3.upload, which lets you upload big files without running out of memory and makes the code a little simpler.

S3.upload({
    Bucket: 'mirage-thumbnails',
    Key: `${process.env.APP_ENV}/${idKey}/${modifiers}`,
    Body: fs.createReadStream(imagePath),
    ContentType: 'image/png',
    ACL: 'public-read',
    CacheControl: 'max-age=0',
}).promise();
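As a side note, the promise returned by upload().promise() resolves with details about the stored object, so the caller can read back the resulting URL. A minimal sketch of consuming it, assuming the same S3 client, idKey, modifiers and imagePath as above (uploadAndLog is just a hypothetical wrapper name):

const uploadAndLog = async () => {
  // Assumes S3, idKey, modifiers and imagePath are defined as above.
  const result = await S3.upload({
    Bucket: 'mirage-thumbnails',
    Key: `${process.env.APP_ENV}/${idKey}/${modifiers}`,
    Body: fs.createReadStream(imagePath),
    ContentType: 'image/png',
  }).promise();
  // The resolved value includes the URL of the uploaded object.
  console.log('Uploaded to', result.Location);
};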

In your code, bufferedData hasn't been filled yet when uploadToS3 is called. You should wait until the file has been read, and only then call uploadToS3. The code should look like this:

const fs = require('fs');
const promisify = require('util').promisify;

// Promisify readFile, to make code cleaner and easier.
const readFile = promisify(fs.readFile);

const uploadToS3 = async(idKey, modifiers, data) => {
  return S3.upload({
    Bucket: 'mirage-thumbnails',
    Key: `${process.env.APP_ENV}/${idKey}/${modifiers}`,
    Body: data,
    ContentType: 'image/png',
    ACL: 'public-read',
    CacheControl: 'max-age=0',
  }).promise();
};

const uploadImage = async(path) => {
  // Wait until the file is read before uploading it
  const data = await readFile(path);
  return uploadToS3('key', 'modifier', data);
};

uploadImage('./some/path/image.png')
  .then(() => console.log('uploaded!'))
  .catch(err => console.error(err));

Using streams, just change uploadImage to:

const uploadImage = async(path) => {
  const stream = fs.createReadStream(path);
  return uploadToS3('key', 'modifier', stream);
};
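If you're on Node 10 or newer, you can also drop util.promisify and use the built-in fs.promises API for the buffered variant; a minimal sketch, assuming the same uploadToS3 helper as above:

const fs = require('fs');

// fs.promises.readFile already returns a promise, so no promisify wrapper is needed.
const uploadImage = async(path) => {
  const data = await fs.promises.readFile(path);
  return uploadToS3('key', 'modifier', data);
};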

Upvotes: 2
