Reputation: 630
I am trying to do this:
Frontend: an <input type="file"> and an <input type="button" onClick="...">. The button handler takes file[0] from the <input type="file">'s value and uses the Fetch API to call my AWS API-Gateway POST API. I have tried both multipart/form-data and application/json as the request content type.
Lambda: I have tried Buffer(body, "utf8"), Buffer(body, "base64") and Buffer(body, "binary"), then S3 for storage using putObject.
Problem:
The file object that ends up in S3 has either file size 0 or some size that doesn't match the original file size. The file downloaded from S3 cannot be opened.
Failed Approaches
I have considered multer and multiparty. These turn out to be ExpressJS middleware and expect an HttpRequest object as input (this isn't stated explicitly anywhere, and it took me a while to find out they only work with an HttpRequest). I also don't know how to transform an AWS event object into an HttpRequest object. That said, it seems excessive to pull in the Express engine just for the sake of managing file uploads.
Exposing my S3 bucket as public-read-write seems rather insecure, so I am not considering having the frontend move files in and out of my S3 bucket directly.
My Request
Can anyone tell me how to get this to work? And/or an alternative to this approach?
Upvotes: 2
Views: 3616
Reputation: 630
This works for me:
Frontend
Encode the File object to a base64 string. (How to convert file to base64 in JavaScript?)
Examining the encoded string data of the file, you will find that it has a header (e.g. data:image/png;base64) separated from the payload by a comma. This is also not explicitly mentioned anywhere. Use .split() and .join() to remove the header.
Now you can compose your request in whatever form you want. I took the easy way out and used JSON.
Here's some code that does what I have mentioned:
const getBase64fromFile = (file) => {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => {
      console.log(`getBase64fromFile success.`);
      // reader.result is a data URL: "data:<mime>;base64,<payload>"
      const spliced = reader.result.split(',');
      const header = spliced[0];
      spliced.shift(); // drop the header, keep only the base64 payload
      resolve({
        header: header,
        body: spliced.join('')
      });
    };
    reader.onerror = (err) => {
      console.log(`getBase64fromFile failed.`);
      reject(err);
    };
    reader.readAsDataURL(file);
  });
}
uploadHandler() {
  console.log(this.state.selectedFile);
  const selectedFile = this.state.selectedFile;
  const email = this.state.email;
  return getBase64fromFile(selectedFile)
    .then((base64Data) => {
      return {
        name: selectedFile.name,
        header: base64Data.header,
        base64: base64Data.body,
        email: email
      };
    })
    .then((body) => {
      console.log(`${JSON.stringify(body)}`);
      return body;
    })
    .then((body) => {
      this.setState({ status: "Begin uploading..." });
      return fetch(this.state.url + "/upload", { // your POST endpoint
        method: 'POST',
        headers: {
          "Authorization": "Bearer " + this.state.token,
          "x-api-key": this.state.apikey,
          "Accept": "application/json",
        },
        body: JSON.stringify(body)
      });
    })
    .then(
      (response) => response.json() // if the response is a JSON object
    )
    .then((success) => {
      console.log(success); // handle the success response object
      this.setState({
        status: success.message
      });
    })
    .catch((error) => {
      console.log(error); // handle the error response object
      this.setState({
        status: JSON.stringify(error)
      });
    });
}
(https://github.com/flameoftheforest/yaUserMan/blob/master/Tests/frontend/src/App.js)
AWS-Lambda
Now that we know the data is coming in as a base64 string, convert it to binary using Buffer.from(body, "base64"). The output of Buffer.from() is a Buffer of raw bytes, which is one of the body types that putObject accepts.
Here's some code that does what I have mentioned:
const params = {
  Bucket: process.env.IMAGE_BUCKET,
  Key: uuid() + event.body.name,
  Body: Buffer.from(event.body.base64, 'base64'),
  ACL: "public-read"
};
(https://github.com/flameoftheforest/yaUserMan/blob/master/yaUserMan/file2S3Helper.js)
Upvotes: 1