lynvie

Reputation: 1038

Why is my S3 upload not uploading correctly?

I upload an image file using the following code:

var AWS  = require('aws-sdk'),
    fs   = require('fs'),
    zlib = require('zlib');

var body = fs.createReadStream(tempPath).pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: myBucket, Key: myKey}});
var params = {
  Body: body,
  ACL: 'public-read',
  ContentType: 'image/png'
};

s3obj.upload(params, function(err, data) {
  if (err) return console.log("An error occurred with S3 file upload: ", err);
  console.log("Uploaded the image file at: ", data.Location);
});

The image uploads to my S3 bucket without errors (I can see it in the S3 console), but when I try to display it on my website I get a broken image icon. When I download the file through the S3 console and try to open it, I get an error saying the file is "damaged or corrupted".

If I upload a file manually through the S3 console, it displays correctly on my website, so I'm fairly sure something is wrong with how I'm uploading.

What is going wrong?

Upvotes: 4

Views: 4229

Answers (2)

lynvie

Reputation: 1038

I eventually found the answer to my question. I needed to pass one more parameter, because the body is gzip-compressed (from piping through zlib.createGzip()). Without ContentEncoding, S3 serves the raw gzip bytes with no Content-Encoding header, so browsers and image viewers try to read the compressed data as a PNG and see it as corrupted. This fixed my problem:

var params = {
  Body: body,
  ACL: 'public-read',
  ContentType: 'image/png',
  ContentEncoding: 'gzip' // tell S3 to serve the object with a Content-Encoding: gzip header
};
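For completeness, here is a minimal sketch of the question's upload flow with the fix applied (assuming tempPath, myBucket, and myKey are defined as in the question):

var AWS  = require('aws-sdk'),
    fs   = require('fs'),
    zlib = require('zlib');

// Gzip the file on the way in, exactly as in the question.
var body = fs.createReadStream(tempPath).pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: myBucket, Key: myKey}});

s3obj.upload({
  Body: body,
  ACL: 'public-read',
  ContentType: 'image/png',
  ContentEncoding: 'gzip' // S3 now serves Content-Encoding: gzip, so browsers decompress on download
}, function(err, data) {
  if (err) return console.log("An error occurred with S3 file upload: ", err);
  console.log("Uploaded the image file at: ", data.Location);
});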

Upvotes: 6

Leah Zorychta

Reputation: 13409

There's a very nice Node module, s3-upload-stream, for compressing and then uploading images to S3. Here's their example code, which is well documented:

var AWS      = require('aws-sdk'),
    zlib     = require('zlib'),
    fs       = require('fs');

// Set the credentials to be used for the upload
// (do this before constructing the S3 client below).
AWS.config.loadFromPath('./config.json');
// or do AWS.config.update({accessKeyId: 'akid', secretAccessKey: 'secret'});

var s3Stream = require('s3-upload-stream')(new AWS.S3());

// Create the streams 
var read = fs.createReadStream('/path/to/a/file');
var compress = zlib.createGzip();
var upload = s3Stream.upload({
  "Bucket": "bucket-name",
  "Key": "key-name"
});

// Optional configuration 
upload.maxPartSize(20971520); // 20 MB 
upload.concurrentParts(5);

// Handle errors. 
upload.on('error', function (error) {
  console.log(error);
});

/* Handle progress. Example details object:
   { ETag: '"f9ef956c83756a80ad62f54ae5e7d34b"',
     PartNumber: 5,
     receivedSize: 29671068,
     uploadedSize: 29671068 }
*/
upload.on('part', function (details) {
  console.log(details);
});

/* Handle upload completion. Example details object:
   { Location: 'https://bucketName.s3.amazonaws.com/filename.ext',
     Bucket: 'bucketName',
     Key: 'filename.ext',
     ETag: '"bf2acbedf84207d696c8da7dbb205b9f-5"' }
*/
upload.on('uploaded', function (details) {
  console.log(details);
});

// Pipe the incoming filestream through compression, and up to S3. 
read.pipe(compress).pipe(upload);
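One caveat, since this pipeline also gzips the stream before uploading: as in the accepted answer, you likely still need to tell S3 about the encoding, or the downloaded file will look corrupted in a browser. s3-upload-stream passes the destination details through to S3's multipart upload call, so a sketch of the adjusted options (the ContentType here is an assumption; use whatever your file actually is) would be:

var upload = s3Stream.upload({
  "Bucket": "bucket-name",
  "Key": "key-name",
  "ContentType": "image/png",    // assumed; match your actual file type
  "ContentEncoding": "gzip"      // the stream is compressed before upload
});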

Upvotes: 0
