Reputation: 291
When I try to read a file with FileReader and the file size is 5.9 GB, the following code fails:
var file = document.getElementById('uploadFileId').files[0];
let reader = new FileReader();
reader.onerror = function() {
    console.log(reader.error);
};
reader.onload = function(e) {
    console.log(" e.target.result ", e.target.result);
};
// reading the whole 5.9 GB file into one ArrayBuffer is the step that fails
reader.readAsArrayBuffer(file);
The error above is thrown (the project uses AngularJS). What I want to achieve is to split the file into 5 MB chunks and send them to the server as a multipart upload.
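Roughly what I have in mind for the chunked upload (just a sketch; the 5 MB size comes from my requirement, and the /upload URL and form field names are placeholders, not a real API):
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per chunk
async function uploadInChunks(file) {
    for (let start = 0; start < file.size; start += CHUNK_SIZE) {
        // slice() only creates a reference to a byte range,
        // so the 5.9 GB file is never read into memory at once
        const chunk = file.slice(start, start + CHUNK_SIZE);
        const formData = new FormData();
        formData.append('chunk', chunk);
        formData.append('offset', String(start));
        // placeholder endpoint; replace with the real upload URL
        await fetch('/upload', { method: 'POST', body: formData });
    }
}
uploadInChunks(document.getElementById('uploadFileId').files[0]);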
Upvotes: 26
Views: 18303
Reputation: 136
This error only happens when the file is changed during or after the reading process.
Upvotes: 1
Reputation: 1688
This seems related to the Chrome 2GB ArrayBuffer size limit (other browsers have higher limits).
One solution is to upload the file in chunks and then save them all to a single file on the server:
const writableStream = new WritableStream({
    start(controller) { },
    async write(chunk, controller) {
        // chunk is a Uint8Array delivered by the stream
        console.log(chunk);
        // upload the chunks here
    },
    close() { },
    abort(reason) { },
});

// stream() reads the File incrementally, so the whole file
// never has to fit into a single ArrayBuffer
const stream = e.target.files[0].stream();
stream.pipeTo(writableStream);
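Inside write() you could POST each chunk as it arrives, along these lines (a sketch only; /upload-chunk is a placeholder endpoint, and the browser decides the chunk size coming off the stream, so it will not be exactly 5 MB unless you re-buffer it yourself):
    async write(chunk, controller) {
        // chunk arrives as a Uint8Array from the file stream
        // placeholder endpoint; adapt to the real server API
        await fetch('/upload-chunk', {
            method: 'POST',
            body: new Blob([chunk]),
        });
    },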
Upvotes: 4
Reputation: 131
I'm getting the same message, but only for files over 2GB. Seems as though there is a file size limit that triggers this unhelpful message.
Upvotes: 10