madmanul

Reputation: 420

Node.js memory leak when creating database

The following code imports data from CSV files into MongoDB.

var fs = require('fs');
var readline = require('readline');

categories.forEach(function (category) {
    var path = dirpath + '/' + directory + '/category_' + category.id + '.csv';

    var readStream = fs.createReadStream(path);
    var readLine = readline.createInterface({
        input: readStream
    });

    var items = [];

    (function (items, readLine) {
        readLine.on('line', function (line) {
            items.push(mapItems(line.split(';')));

            if (items.length > 10000) {
                saveItems(items); // save chunk
                items.length = 0; // clear the array
            }
        });

        readLine.on('close', function () {
            saveItems(items);
            items.length = 0;
        });
    })(items, readLine);
});

The saveItems function:

function saveItems(items) {
    schema.items.collection.insert(items, function (err) {
        if (err)
            console.log(err);
    });
}

For big files (about 300 MB) this code crashes with "process out of memory", despite the fact that the items array is cleared. Can anyone explain to me why?

Upvotes: 0

Views: 85

Answers (1)

akras14

Reputation: 96

It's hard to tell. You can try taking a snapshot of your memory every few thousand lines and see why the memory is growing; more info here: http://www.alexkras.com/simple-guide-to-finding-a-javascript-memory-leak-in-node-js/
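
As a rough sketch (untested, and assuming the third-party heapdump module from npm), you could write a heap snapshot every so many lines inside your existing 'line' handler and then compare the resulting .heapsnapshot files in Chrome DevTools to see what is growing:

var heapdump = require('heapdump'); // assumption: npm install heapdump

var lineCount = 0;

readLine.on('line', function (line) {
    items.push(mapItems(line.split(';')));

    // write a heap snapshot every 50,000 lines; diff the files in DevTools later
    if (++lineCount % 50000 === 0) {
        heapdump.writeSnapshot(Date.now() + '.heapsnapshot');
    }

    if (items.length > 10000) {
        saveItems(items); // save chunk
        items.length = 0; // clear the array
    }
});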

Most likely you just cannot store all that data in memory (the large file that you are reading in). You will probably have to break it up into more manageable chunks (e.g. 1000 lines at a time), insert those smaller chunks, and free up the memory as you go. Unfortunately, I am not sure how to do that in MongoDB, so you will have to figure it out.
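
For what it's worth, one common pattern on the Node side (a sketch only, untested; it reuses your own insert call and just uses readline's pause()/resume()) is to stop reading while a chunk is being written, so unsaved documents cannot pile up faster than MongoDB accepts them:

var items = [];

readLine.on('line', function (line) {
    items.push(mapItems(line.split(';')));

    if (items.length >= 1000) {
        readLine.pause(); // stop emitting 'line' events while we write
        schema.items.collection.insert(items, function (err) {
            if (err)
                console.log(err);
            items = [];        // drop the saved chunk so it can be garbage collected
            readLine.resume(); // only then continue reading the file
        });
    }
});

readLine.on('close', function () {
    if (items.length > 0) {
        schema.items.collection.insert(items, function (err) {
            if (err)
                console.log(err);
        });
    }
});

The key difference from your original code is that no new lines are read until the previous insert has finished, so only one unsaved chunk is in memory at a time.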

Upvotes: 1
