Reputation: 10257
In Python, we can read in a file line by line in a very neat way:
with open("filename") as fp:
    for line in fp:
        # handle your line
Just curious whether there is a similar way to achieve this in Node.js/JavaScript.
The closest way I know of in Node.js/JavaScript is:
var fs = require('fs');
var readline = require('readline');
var stream = require('stream');

var instream = fs.createReadStream("filename");
var outstream = new stream;
var rl = readline.createInterface(instream, outstream);

rl.on('line', function (line) {
    // handle your line here
});
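Update: I have since seen that recent Node.js versions make the readline interface async-iterable, which gets very close to the Python idiom. A minimal sketch, assuming a Node.js version with async iterator support (I believe this landed around v11.4):

const fs = require('fs');
const readline = require('readline');

async function processFile(path) {
    const rl = readline.createInterface({
        input: fs.createReadStream(path),
        crlfDelay: Infinity // treat '\r\n' as a single line break
    });

    // Lines are yielded as the stream is read; no manual event handling needed.
    for await (const line of rl) {
        // handle your line here
    }
}

processFile("filename");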
Thanks
Derek
Upvotes: 3
Views: 1425
Reputation: 29
You can use my line-reader library if you prefer: https://github.com/bilaloguz/secureWebServer/blob/master/line_reader.js
In my experiments, reading a 1-million-line text file and writing each line to the console took 218 seconds with Python and 111 seconds with Node.js (Ubuntu 16.04).
Upvotes: 0
Reputation: 7343
I would recommend using the line-by-line npm package.
It is useful for reading large files because it does not buffer the entire file in memory.
It pauses the stream when it receives a data chunk, emits a 'line' event for each complete line in that chunk, and retains the leftover portion of the last line. It then resumes the stream, merges the leftover with the next chunk, and repeats the process (a hand-rolled sketch of this idea follows the example below).
You can check its source code here.
Below is an example snippet:
var LineByLineReader = require('line-by-line'),
    lr = new LineByLineReader('big_file.txt');

lr.on('error', function (err) {
    // 'err' contains error object
});

lr.on('line', function (line) {
    // pause emitting of lines...
    lr.pause();

    // ...do your asynchronous line processing...
    setTimeout(function () {
        // ...and continue emitting lines.
        lr.resume();
    }, 100);
});

lr.on('end', function () {
    // All lines are read, file is closed now.
});
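If you are curious how the leftover-retention idea works without the library, here is a minimal hand-rolled sketch using only the core fs module (simplified: it assumes UTF-8 input and '\n' line endings, and does not pause the stream during processing):

var fs = require('fs');

var instream = fs.createReadStream('big_file.txt', { encoding: 'utf8' });
var leftover = '';

instream.on('data', function (chunk) {
    // Prepend whatever was left from the previous chunk, then split on
    // newlines; the last piece may be an incomplete line.
    var pieces = (leftover + chunk).split('\n');
    leftover = pieces.pop(); // keep the partial last line for the next chunk

    pieces.forEach(function (line) {
        // handle your line here
    });
});

instream.on('end', function () {
    // Flush the final line if the file does not end with a newline.
    if (leftover) {
        // handle your last line here
    }
});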
Upvotes: 2