Reputation: 53
I am trying to push objects into an array based on a txt file that has around 500,000 lines or more.
I am using require('readline') to handle it, but the processing "pauses" by itself when it reaches around line 470000 (e.g.), without any errors, warnings, or notices...
This is an example of my code (the original code fills the dataRow object, then it "pauses" when it reaches line 411000):
let myList = [];
let lineReader = require('readline').createInterface({
    input: require('fs').createReadStream(filePath).pipe(iconv.decodeStream('latin1'))
});

lineReader.on('line', function (line) {
    // here there are a lot more fields, but I had to cut them down for this example
    let dataRow = JSON.parse('{"Agencia_Cobranca":"","Aliquota_ICMS":""}');
    myList.push(dataRow);

    // this if is only there to track what is happening
    if (myList.length % 10000 == 0 || myList.length > 420000) {
        console.log(" myList executed: ", myList.length, ' - ', JSON.stringify(myList[myList.length - 1]).length, ' - ', new Date());
    }
}).on('close', function () {
    console.log('finished');
    process.exit(0);
});
I am using this command line to execute it:
node --max-old-space-size=8192 teste
Well... this is the result: the screen just stays this way when it reaches that line... it never ends, and there are no errors :(
Upvotes: 2
Views: 1230
Reputation: 53
In Node.js (and in JavaScript generally) the maximum length of an array is 2^32 - 1 (4,294,967,295). Just try executing this in a Node.js application:
console.log(new Array(4294967295)); // works

try {
    console.log(new Array(4294967296)); // throws RangeError: Invalid array length
} catch (err) {
    console.log(err);
}
Upvotes: 1
Reputation: 25
Consider using a database if you work with that much data. Storing it in a table and then querying only the data you need to work with would be more efficient.
Upvotes: 0
Reputation: 1797
Your stack/RAM is probably full and failing in a weird way. I would recommend, if at all possible, making your program more memory-efficient: do everything you need to do with a line as you read it, then discard it. Storing it all in memory is never going to be a solution.
Upvotes: 1