Jeff Voss

Reputation: 3695

Stream File Directly to s3 using NodeJS+Express, aws-sdk

I want to upload some large files directly to S3 from the browser through NodeJS, but it is unclear how to prepare the file for the upload. There might be a better-suited module (like Knox) for this case, but I am not sure. Any thoughts?

File Object

  file: { 
     webkitRelativePath: '',
     lastModifiedDate: '2013-06-22T02:43:54.000Z',
     name: '04-Bro Safari & UFO! - Animal.mp3',
     type: 'audio/mp3',
     size: 11082039 
  }

S3 putObject

var params = {Bucket: 'bucket_name', Key: req.user._id + '/folder/' + req.body['file']['name'], Body: ???};
s3.putObject(params, function(err, data) {
    if (err)
      console.log(err);
    else
      console.log("Successfully uploaded data to myBucket/myKey");
});    

Upvotes: 13

Views: 26945

Answers (5)

Shree Harsha S

Reputation: 685

In SDK v3, the PutObjectCommand cannot write a file stream to S3. Use the Upload class from the @aws-sdk/lib-storage package instead, which handles both buffers and streams.

Example:

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');

const s3Client = new S3Client({});

const upload = async (fileStream) => {
    const uploadParams = {
        Bucket: 'test-bucket',
        Key: 'image1.png',
        Body: fileStream,
    }

    try {
        const parallelUpload = new Upload({
            client: s3Client,
            params: uploadParams,
        });

        // report upload progress
        parallelUpload.on("httpUploadProgress", (progress) => {
            console.log(progress);
        });

        await parallelUpload.done();
    } catch (e) {
        console.log(e);
    }
}

Ref - https://github.com/aws/aws-sdk-js-v3/blob/main/UPGRADING.md#s3-multipart-upload

Upvotes: 0

Streaming is now supported (see the docs); simply pass the stream as the Body:

var fs = require('fs');
var AWS = require('aws-sdk');

var someDataStream = fs.createReadStream('bigfile');
var s3 = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });
s3.putObject({ Body: someDataStream, ... }, function(err, data) {
  // handle response
});

Upvotes: 19

Ankur Sanghi

Reputation: 266

One option is to use multer-s3 instead: https://www.npmjs.com/package/multer-s3.

This post also has some relevant details: Uploading images to S3 using NodeJS and Multer. How to upload whole file onFileUploadComplete

Upvotes: 1

Peter Lyons

Reputation: 145994

Your code isn't streaming. There should be a call to pipe() somewhere, or at least code that pipes by hand using data event handlers. You are probably using Express's bodyParser middleware, which is NOT a streaming implementation: it stores the entire request body as a temporary file on the local filesystem.

I'm not going to provide specific suggestions because of the promising results I got from a web search for "node.js s3 stream". Spend 5 minutes reading, then post a snippet that is at least an attempt at streaming and we can help you get it right once you have something in the ballpark.

Upvotes: 0

hexacyanide

Reputation: 91599

The s3.putObject() method does not stream, and from what I see, the s3 module doesn't support streaming. However, with Knox, you can use Client.putStream(). Using the file object from your question, you can do something like this:

var fs = require('fs');
var knox = require('knox');

var stream = fs.createReadStream('./file');
var client = knox.createClient({
  key: '<api-key-here>',
  secret: '<secret-here>',
  bucket: 'learnboost'
});

// `file` is the file object from the question
var headers = {
  'Content-Length': file.size,
  'Content-Type': file.type
};

client.putStream(stream, '/path.ext', headers, function(err, res) {
  // error or successful upload
});

Upvotes: 3
