crazylearner

Reputation: 51

How to efficiently load large files into IndexedDB storage? My app is crashing at over 100,000 rows

I have a web application which relies on uploading a large number of rows of data from a local file on the client side. The aim is to store this data in IndexedDB.

The data only has two columns I am interested in, each containing a string no longer than 25 characters; however, there can be up to 1 million rows.

After reading a lot of questions and docs, I have written code which seems to work in creating the IndexedDB with smaller datasets (below 20,000 rows), but breaks on larger data.

I'm sure this is due to poor design, as I'm new to this style of work, or potentially some sort of freeze in the Chrome browser. I don't receive any error messages; I can trigger an alert showing that the last iteration of the for loop is reached, but the oncomplete handler never fires and the data never seems to make it into the database.

The input to the function, e, is a read file.

I also perform an operation on the data within the for loop, but I have removed this for simplicity.
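For context, the reader is wired up roughly like this (a simplified sketch rather than my exact code; the file input element is just a placeholder):

// Hypothetical wiring: read the chosen file as text and pass the load
// event to storeDataEnc, so e.target.result holds the file contents.
document.querySelector('#fileInput').addEventListener('change', function (evt) {
    var file = evt.target.files[0];
    var reader = new FileReader();
    reader.onload = storeDataEnc;
    reader.readAsText(file);
});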

function storeDataEnc (e) {
    var lines = e.target.result.split('\n');
    var request = self.indexedDB.open('DB', 1);
    request.onerror = function(e) {
        console.log("there was an error: " + e.target.error);
    };
    request.onupgradeneeded = function(e) {
        var db = request.result;
        // Records are keyed on their COL2 value
        var store = db.createObjectStore("dataTable", { keyPath: "COL2" });
    };

    request.onsuccess = function(e) {

        var db = request.result;
        var tx = db.transaction("dataTable", "readwrite");

        var store = tx.objectStore("dataTable");

        db.onerror = function(e) {
            console.log("ERROR: " + e.target.error);
        };

        // Parse a single tab-separated line and queue it for storage,
        // skipping comment lines that start with '#'
        function forEachLinenow (match) {
            if (match.charAt(0) != '#') {
                match = match.trim();
                var fields = match.split('\t');
                var col1in = fields[0];
                var col2in = fields[3];

                store.put({ COL1: col1in, COL2: col2in });
            }
        }

        for (var i = 0; i < lines.length; ++i) {
            var test = lines.length - 1;
            if (i == test) { console.log('nearly done'); }

            forEachLinenow(lines[i] + '\n');
        }

        tx.oncomplete = function() {
            db.close();
            alert("all data read");
        };
    };
}

I am guessing there is some browser mechanism I do not understand that stops malicious apps from taking up too many resources. Has anyone worked with data of this size who can spot the error in my process?

My guess would be that I need to split the work across more than one transaction. I did try that, but it didn't seem to change my issue.
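What I tried looked roughly like this (a simplified sketch rather than my exact code; the batch size, store name, and field positions are placeholders):

// Hypothetical batching: open a separate readwrite transaction for each
// chunk of rows instead of pushing every put through one transaction.
// Assumes `db` is an already-open IDBDatabase and `lines` the array of rows.
function storeBatch(db, lines, start, end) {
    var tx = db.transaction("dataTable", "readwrite");
    var store = tx.objectStore("dataTable");
    for (var i = start; i < end; ++i) {
        var line = lines[i].trim();
        if (line.length > 0 && line.charAt(0) != '#') {
            var fields = line.split('\t');
            store.put({ COL1: fields[0], COL2: fields[3] });
        }
    }
    return tx;
}

function storeInBatches(db, lines, batchSize) {
    for (var start = 0; start < lines.length; start += batchSize) {
        storeBatch(db, lines, start, Math.min(start + batchSize, lines.length));
    }
}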

I know this may be slow; however, speed itself is not the biggest issue as long as the data is successfully ported in.

Upvotes: 3

Views: 4357

Answers (1)

Classified

Reputation: 222

You could be hitting the data size limits of the browser.

The Mozilla docs mention the limits here: https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API/Browser_storage_limits_and_eviction_criteria

And here Google documents some more IndexedDB limits for popular browsers: https://developers.google.com/web/fundamentals/instant-and-offline/web-storage/offline-for-pwa

It seems the limits are all based on the available storage of the host OS. Check the size of the data you are expecting to import against your available storage.
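A quick way to check available storage from the page itself (a minimal sketch; navigator.storage.estimate() is not supported in every browser, so treat support as an assumption for your targets):

// Reports approximate usage and quota, in bytes, for the current origin.
if (navigator.storage && navigator.storage.estimate) {
    navigator.storage.estimate().then(function (estimate) {
        console.log("usage (bytes): " + estimate.usage);
        console.log("quota (bytes): " + estimate.quota);
    });
}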

Upvotes: 4
