hg_git

Reputation: 3084

read file stream line by line synchronously

I'm looking at the Node.js readline module documentation for a task where I have to read a very large file line by line, and it looks good. But for my particular task I need the lines to be handled in order, i.e. no matter what, line 5 must not be processed before line 4. Due to the nature of Node, I just want to confirm: is this code safe for that usage?

const readline = require('readline');
const fs = require('fs');

const rl = readline.createInterface({
  input: fs.createReadStream('sample.txt')
});

rl.on('line', (line) => {
  console.log(`Line from file: ${line}`);
});

If not, what should I use or do instead? It is currently working for me, but I don't know whether it will hold up with large files, where the next line could end up being parsed faster than the previous one, etc.

Upvotes: 1

Views: 1026

Answers (1)

Lazyexpert

Reputation: 3154

I very much doubt that a callback fired later can be executed earlier than one fired before it; that ordering comes down to the event loop and the call stack of the process.

Still, to guarantee it, I can suggest implementing something similar to async.queue, but with the ability to push callbacks dynamically.

Assuming you will have something like this:

const Queue = require('./my-queue')
const queue = new Queue()

function addLineToQueue(line) {
  queue.push(function() {
    // do some job with line
    console.log(`Line: "${line}" was successfully processed!`)
  })
}
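A minimal sketch of what my-queue could look like (assuming each pushed task is a function that may return a promise; the Queue class name and its push method match the usage above, the rest is purely illustrative):

// my-queue.js -- runs pushed tasks strictly one at a time, in push order
class Queue {
  constructor() {
    this.tasks = [];
    this.running = false;
  }

  push(task) {
    this.tasks.push(task);
    // Start draining as soon as there is work and nothing is running yet.
    if (!this.running) this.drain();
  }

  async drain() {
    this.running = true;
    while (this.tasks.length > 0) {
      const task = this.tasks.shift();
      await task(); // finish this line's work before starting the next
    }
    this.running = false;
  }
}

module.exports = Queue;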

Then modify your code:

rl.on('line', (line) => {
  addLineToQueue(line)
  console.log(`Added line to queue: ${line}`)
})

And of course your queue implementation should start processing as soon as it has any tasks to execute. This way the order of the callbacks is guaranteed. But to me it looks like a bit of overhead.
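If a full queue feels like too much, a lighter sketch of the same idea is to serialize the per-line work on a promise chain (processLine here is a hypothetical async handler, not part of your code):

let chain = Promise.resolve();

rl.on('line', (line) => {
  // Start this line's work only after the previous line's work has settled,
  // so processing order matches emission order.
  chain = chain
    .then(() => processLine(line))
    .catch((err) => console.error(`Failed to process line: ${err}`));
});

Either way, each line's work finishes before the next line's work starts, which is what guarantees the order.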

Upvotes: 1
