Reputation: 97
We have a React/Node app hosted on Digital Ocean. We're also using Digital Ocean Spaces, which is interoperable with AWS S3, for object storage. The app is essentially an in-house Dropbox: Admins can create folders and upload content to them, and Clients can log in and download any files we grant them access to.
We're able to upload files of any size to Digital Ocean Spaces without issue.
The problem is that when we try to download (as Admin or Client) any file over 100 MB, we hit a JavaScript heap out of memory error on the backend.
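For context, our backend talks to Spaces through the AWS SDK (v2), pointed at the Spaces endpoint. It's set up roughly like this, with the region, keys, and bucket name as placeholders:
const AWS = require("aws-sdk");
// Spaces is S3-compatible, so the only real difference is the endpoint.
const spacesEndpoint = new AWS.Endpoint("nyc3.digitaloceanspaces.com"); // placeholder region
const s3 = new AWS.S3({
  endpoint: spacesEndpoint,
  accessKeyId: process.env.SPACES_KEY,
  secretAccessKey: process.env.SPACES_SECRET
});
const bucketName = process.env.SPACES_BUCKET; // used as Bucket in the snippets below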
Some of the solutions we've attempted are:
Frontend code
downloadFile = (id, name, type) => {
  // Let the user know the download has started.
  this.props.handleSnackBar("Your download has been started. Please wait.");

  axios
    .get(`/test-download/${id}`)
    .then(res => {
      console.log(res.data.data.Body);
      // The S3 Body arrives as a serialized Buffer ({ type, data: [...] }),
      // so rebuild it into a Blob before handing it to the download helper.
      download(
        new Blob([new Uint8Array(res.data.data.Body.data)]),
        `${name}.${type}`
      );
      this.props.handleSnackBar("Your download is now ready.");
    })
    .catch(err => console.log(err));
};
Backend code
app.get("/test-download/:id", (req, res) => {
var params = {
Bucket: bucketName,
Key: req.params.id
};
s3.getObject(params, function(err, data) {
//
console.log(data);
//
if (!err) {
res.send({ data, key: params.Key });
} else {
console.log({ err }); // an error occurred
}
});
});
Backend code with stream
app.get("/test-download/:id", (req, res) => {
var params = {
Bucket: bucketName,
Key: req.params.id
};
// TRY
const fileRequest = s3.getObject(params);
let chunks = [];
fileRequest
.createReadStream()
.on("data", function(data) {
console.log(`Received ${data.length} bytes of data`);
chunks.push(data);
})
.on("end", function() {
console.log("no more data");
bufferData = Buffer.concat(chunks);
console.log(bufferData);
res.send({ bufferData, key: params.Key });
});
});
So, basically I'm sort of stuck. Any assistance that can be offered is greatly appreciated. Thanks.
Upvotes: 2
Views: 2598
Reputation: 97
Thanks to Marcos, I revisited the piping code we had attempted. Now that I fully understood the raw data response coming back from createReadStream().pipe(), I was able to convert the data on the frontend.
Frontend code
downloadFile = (id, name, type) => {
  // Let the user know the download has started.
  this.props.handleSnackBar("Your download has been started. Please wait.");

  // The backend now pipes raw bytes rather than JSON, so ask axios for an
  // ArrayBuffer and pass it straight to the download helper.
  axios
    .get(`/test-download/${id}`, { responseType: "arraybuffer" })
    .then(res => {
      console.log(res);
      download(res.data, `${name}.${type}`);
      this.props.handleSnackBar("Your download is now ready.");
    })
    .catch(err => console.log(err));
};
Backend code
app.get("/test-download/:id", (req, res) => {
var params = {
Bucket: bucketName,
Key: req.params.id
};
s3.getObject(params)
.createReadStream()
.pipe(res)
.on("finish", () => {
console.log("** done");
});
});
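One thing the snippet above doesn't handle yet is a stream error (for example, a key that doesn't exist), and an unhandled "error" event would crash the Node process. A rough sketch of how we'd guard against that, untested in our app, so treat it as an assumption:
app.get("/test-download/:id", (req, res) => {
  const params = { Bucket: bucketName, Key: req.params.id };

  s3.getObject(params)
    .createReadStream()
    .on("error", err => {
      console.log({ err }); // e.g. NoSuchKey when the id doesn't exist
      if (!res.headersSent) {
        res.sendStatus(404);
      } else {
        res.end(); // data already started flowing; just terminate
      }
    })
    .pipe(res);
});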
Upvotes: 2
Reputation: 40404
The issue is that while you're using streams in the last snippet, you buffer all the chunks, which defeats the purpose of using streams. What you should do instead is .pipe directly to the response; this way the memory used will stay quite low.
app.get("/test-download/:id", (req, res) => {
const params = {
Bucket: bucketName,
Key: req.params.id
};
s3.getObject(params)
.createReadStream()
.pipe(res);
});
Keep in mind that you're no longer responding with a JSON object, so the client has to be changed accordingly.
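For example, with axios you could request the raw bytes directly (this assumes a blob download helper like the one in the question):
downloadFile = (id, name, type) => {
  // The response body is now the file itself, not JSON, so ask for an ArrayBuffer.
  axios
    .get(`/test-download/${id}`, { responseType: "arraybuffer" })
    .then(res => download(res.data, `${name}.${type}`))
    .catch(err => console.log(err));
};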
Upvotes: 1