JoelParke

Reputation: 2736

How to properly upload an object to AWS S3?

After three days of chasing this problem, it's time to ask for some help. I have a signed URL for uploading a file to Amazon S3. I know the URL is correct, since a

curl -v -T file.png --header "Content-Type:binary/octet-stream" "https://sample.s3.amazonaws.com/33d7e8f0-0fc5-11e5-9d95-2b3410860edd?AWSAccessKeyId=XXXXXXXX&Content-Type=binary%2Foctet-stream&Expires=1433984735&Signature=LUjj8iIAbCfNoskGhqLDhuEWVG4%3D"

succeeds correctly.
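(For reference, the URL was generated server-side with something like the following; the bucket, key, and expiry here are placeholders. Note that ContentType is part of the signature, which is why the curl command must send the matching Content-Type header.)

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var puturl = s3.getSignedUrl('putObject', {
  Bucket: 'sample',                              // placeholder bucket
  Key: '33d7e8f0-0fc5-11e5-9d95-2b3410860edd',
  Expires: 900,                                  // lifetime in seconds
  // Baked into the signature: the PUT must send this exact Content-Type
  ContentType: 'binary/octet-stream'
});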

But my $.ajax code (below), which does the upload, leaves the content slightly garbled in the S3 bucket.
For example, if I upload a .pdf file, it loads properly from the S3 Management Console, but a .png or .jpeg, etc. fails... and looking closely, the uploaded file has a slightly wrong length.

The heart of the call in the browser is:

var formData = new FormData();
formData.append("upload", file);
$.ajax({
  url: data.puturl,
  type: 'PUT',
  xhr: function() {
    var myXhr = $.ajaxSettings.xhr();
    if (myXhr.upload) {
      myXhr.upload.addEventListener('progress', progressHandlingFunction, false);
    }
    return myXhr;
  },
  success: completeHandler,
  error: errorHandler,
  data: formData,
  cache: false,
  contentType: "binary/octet-stream",
  processData: false
});

This almost works, but the data is garbled. I have tried setting the content type to file.type, etc., to no avail. Is there some encoding I need to apply here, such as base64, that I am missing?

Any insight would be greatly appreciated.
Or, if there is an easy way to do the same thing without $.ajax, that would be great too.

Upvotes: 0

Views: 1460

Answers (1)

JoelParke

Reputation: 2736

From the question that 'tedder42' asked above, and some more experimentation, I realized that sending the FormData was the issue: jQuery submits a FormData object as a multipart/form-data body, so the multipart boundary and part headers get stored in the object, which explains the slightly wrong length. So I changed the code to just use a FileReader() and pass the raw bytes. This works perfectly. Here is the code:

var reader = new FileReader();
reader.readAsArrayBuffer(file);
reader.onload = function (e) {
  var rawData = reader.result;
  $.ajax({
    url: data.puturl,
    type: 'PUT',
    xhr: function() {
      var myXhr = $.ajaxSettings.xhr();
      if (myXhr.upload) {
        myXhr.upload.addEventListener('progress', progressHandlingFunction, false);
      }
      return myXhr;
    },
    success: completeHandler,
    error: errorHandler,
    // Raw bytes (an ArrayBuffer), not a FormData wrapper
    data: rawData,
    cache: false,
    contentType: "binary/octet-stream",
    processData: false
  });
};
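As an aside, the question asked whether the same thing could be done without $.ajax. It can: XMLHttpRequest accepts a File/Blob directly in send(), so the FileReader step is not even required. A minimal sketch, reusing the data.puturl, progressHandlingFunction, completeHandler, and errorHandler names from above:

var xhr = new XMLHttpRequest();
xhr.open('PUT', data.puturl, true);
// Must match the Content-Type the URL was signed with
xhr.setRequestHeader('Content-Type', 'binary/octet-stream');
xhr.upload.addEventListener('progress', progressHandlingFunction, false);
xhr.onload = function () {
  if (xhr.status === 200) {
    completeHandler(xhr.response);
  } else {
    errorHandler(xhr);
  }
};
xhr.onerror = errorHandler;
xhr.send(file); // send() streams the File's raw bytes, no encoding applied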

The FileReader approach is much more straightforward and everything works perfectly. Then, when the data is downloaded later using a signed URL, I simply put

ResponseContentDisposition: "attachment; filename=" + fileData.name

as one of the params in the s3.getSignedUrl('getObject', params) call.

And, in full, the response-override params I put in the pre-signed URL to retrieve the file are

ResponseContentDisposition: "attachment; filename=" + fileData.name,
ResponseContentType: fileData.type

which takes care of ensuring that the browser expects what it is receiving.
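Putting those together, a minimal sketch of the download-side signing call (the bucket name and Expires value are placeholders; fileData is assumed to hold the original file name and MIME type):

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var params = {
  Bucket: 'sample',      // placeholder bucket
  Key: file.key,
  Expires: 900,          // URL lifetime in seconds
  // S3 overrides the response headers with these values
  ResponseContentDisposition: "attachment; filename=" + fileData.name,
  ResponseContentType: fileData.type
};
var geturl = s3.getSignedUrl('getObject', params);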

Because the upload URL was signed for binary/octet-stream, the object's real content type cannot be set at upload time, so I also added code on the server side to change the object's content type after the upload has completed. S3 makes this possible without re-uploading the data: you copy the object onto itself with MetadataDirective: 'REPLACE'. Here is the heart of the code:

var params = {
  Bucket: 'sample',
  CopySource: 'sample/' + file.key,   // copy the object onto itself
  Key: file.key,
  MetadataDirective: 'REPLACE',       // replace metadata, including ContentType
  ContentType: type
};
s3.copyObject(params, function (err, data) {
  if (err) {
    console.log(err);
    return validationError(res, 'unable to change content-type!');
  }
  res.sendStatus(200);
});

This was a pain to finally get right and I am hopeful that this will help others!

Upvotes: 1
