Suprido

Reputation: 533

FileReader fails reading big blobs

I've encountered a problem with FileReader when reading fairly large Blobs.

const size = 50; // MB
const blob = new Blob([new ArrayBuffer(size * 1024 * 1024)], {type: 'application/octet-stream'});

console.log(blob.size);

const reader = new FileReader();
reader.onload = function(e) {
    console.log(new Uint8Array(e.target.result));
};
reader.readAsArrayBuffer(blob.slice(0, 1024));

https://jsfiddle.net/aas8gmo2/

The example above shows that the onload function is not called every time (if it is, increase the size of the Blob to 100/200/300 MB). The problem is reproducible only in Chrome (tested with 53.0.2785.143).
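
For reference, wiring up the other FileReader events shows whether the read fails silently or simply never completes (a minimal diagnostic sketch; the variable name diagReader is just illustrative):

const diagReader = new FileReader();

// Log every lifecycle event so a silently failing read shows up in the console.
diagReader.onload = e => console.log('loaded', e.target.result.byteLength, 'bytes');
diagReader.onerror = () => console.error('read failed:', diagReader.error);
diagReader.onabort = () => console.warn('read aborted');
diagReader.onloadend = () => console.log('readyState:', diagReader.readyState); // 2 = DONE

diagReader.readAsArrayBuffer(blob.slice(0, 1024));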

Any hints on what could be wrong?

Upvotes: 1

Views: 512

Answers (1)

nioKi

Reputation: 1289

Last time I used Chrome, there was a hard cap of around 500 MB for a single Blob.

Also, have a look at these threads: https://bugs.chromium.org/p/chromium/issues/detail?id=375297 and https://github.com/streamproc/MediaStreamRecorder/issues/86

They suggest that memory is not properly cleared when several small Blobs are created, and you may need to reload the page to be able to continue. (That would also explain why the JSFiddle can take several tries.)

So for now, a disappointing answer, but it seems you're going to have to find a workaround... or dive into Chrome's source code.
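
In case it helps while the bug is open, here's a rough sketch of one possible workaround (untested against this particular issue; readInChunks is just an illustrative name): reuse a single FileReader and read the Blob in sequential slices, so only one temporary ArrayBuffer is alive at a time and no extra Blob objects pile up.

function readInChunks(blob, chunkSize, onChunk, onDone) {
    const reader = new FileReader();
    let offset = 0;

    reader.onload = function(e) {
        onChunk(new Uint8Array(e.target.result), offset);
        offset += chunkSize;
        if (offset < blob.size) {
            // Reuse the same reader for the next slice instead of creating a new one.
            reader.readAsArrayBuffer(blob.slice(offset, offset + chunkSize));
        } else {
            onDone();
        }
    };
    reader.onerror = function() {
        console.error('chunk read failed:', reader.error);
    };

    reader.readAsArrayBuffer(blob.slice(0, chunkSize));
}

// e.g. with the 50 MB blob from the question, in 1 MB slices:
readInChunks(blob, 1024 * 1024, function(chunk, offset) {
    console.log('chunk at', offset, 'length', chunk.length);
}, function() {
    console.log('done');
});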

Upvotes: 1
