Chad Brown

Reputation: 1667

Upload to Amazon S3 using API for node.js Extremely Slow

I've been trying to figure out why uploading to Amazon S3 is amazingly slow using the putObject command (node.js library). The code below reads an entire directory of files and uploads them to S3 asynchronously.

//Read a directory of files
fs.readdir(dir, function(err, files){

    //Upload each file within the folder; forEach keeps `file`
    //correctly scoped inside the async callbacks (a plain for/var
    //loop would only see the last file by the time readFile fires)
    files.forEach(function(file){

        //Read the file; fs.readFile already returns a Buffer,
        //so it can be passed to S3 directly
        fs.readFile(path.join(dir, file), function(err, data){

            //Add the pdf to S3
            s3.putObject({
                'Bucket': bucket,
                'Key': path.join(key, file),
                'Body': data,
                'ContentType': mime(file)
            }, function(err, data) {

                //Wait for all the other files to be done
                // and perform a callback
            });
        });
    });
});

I've tested with a number of different folders, with similar results.

Uploading the same files using the AWS web interface takes around 3 seconds or less to complete. Why is using the node.js API so slow?

As per the Amazon documentation, I've even tried spawning multiple child processes to handle each upload independently. No change in upload speed.
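Roughly along these lines (a simplified sketch; upload-worker.js is a stand-in for the actual worker script, which reads a file path from process.argv and calls putObject on it):

    var fork = require('child_process').fork;
    var path = require('path');

    //Fork one worker process per file; each worker uploads independently
    files.forEach(function(file){
        fork(path.join(__dirname, 'upload-worker.js'), [path.join(dir, file)]);
    });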

Upvotes: 3

Views: 4366

Answers (3)

Prakhar Dev Gupta

Reputation: 11

It may be very late to answer this question, but have you tried enabling Transfer Acceleration for your bucket? It incurs some extra charge but definitely improves upload/download speed.

Read here: Enable Transfer Acceleration for your S3 Bucket

Once you enable this on your bucket, you can add the parameter {useAccelerateEndpoint: true} to your S3 client.
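For example, with the AWS SDK for JavaScript v2 the option goes into the S3 client config (the region here is just a placeholder):

    const AWS = require('aws-sdk');

    //Requests are routed through the s3-accelerate endpoint
    const s3 = new AWS.S3({
        region: 'us-east-1',
        useAccelerateEndpoint: true
    });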

Upvotes: 0

Rohit Gupta

Reputation: 1378

I had the same requirement to upload multiple files, so I leveraged Promises to upload the files in parallel.

    const fs = require('fs');
    const path = require('path');
    const AWS = require('aws-sdk');

    const s3 = new AWS.S3();

    async function uploadFile(filePath, bucket) {

        const fileContent = fs.readFileSync(filePath);
        const params = {
            Bucket: bucket,
            Key: path.basename(filePath),
            Body: fileContent
        };

        return s3.upload(params).promise();
    }

    // await is only valid inside an async function,
    // so the fan-out loop is wrapped in one
    async function uploadAll(fileList, bucket) {
        const uploadFilePromises = [];

        for (const file of fileList) {
            uploadFilePromises.push(uploadFile(file, bucket));
        }
        return Promise.all(uploadFilePromises);
    }

This reduced the overall upload time considerably.

Upvotes: 0

tom f

Reputation: 389

Did you set the proper region when you created a new S3 instance in Node?

Say, for example, your S3 bucket is in us-east-1. For optimal transfer speeds you'd want to make sure your S3 instance is set to that region, like:

    const s3 = new AWS.S3({
        accessKeyId: "xxx",
        secretAccessKey: "xxx",
        region: 'us-east-1'
    });

Otherwise it can be incredibly slow. Someone can probably chime in with the specific reasons why this happens; I'd guess it has to do with repeatedly looking up the actual region while doing multi-part requests, or possibly uploading to another region that's much further away from your destination region.

Upvotes: 3
