Leonardo Rick

Reputation: 788

"Stream" each new line of file using Node.js

I want to read every new line added to a file, one by one, as if it were a stream of data. So far I'm using chokidar to receive a notification on each update to that file, and then I'm using read-last-lines, passing 1 to indicate that I want to read only one line.

import fs from 'fs';
import chokidar from 'chokidar';
import readLastLines from 'read-last-lines';

const logFile = 'log.txt';

// First check that the file exists
if (fs.existsSync(logFile)) {
  // Then start watching it for every update
  chokidar.watch(logFile).on('change', (_path, _stats) => {
    // Read the last line on each update
    readLastLines.read(logFile, 1)
      .then((line: string) => {
        // Do something with the received line.
        // Can't receive two or three lines if they are written instantaneously
        doSomething(line);
      })
      .catch((err: any) => {
        console.log(err);
      });
  });
} else {
  console.log('File Not Found');
}

It works very well, except when the file receives two or three updates almost instantaneously. In that case I only get the last line of that sequence of updates. I think maybe chokidar has a delay in its operation, or my OS is not allowing the file, which is being written by another program, to be read by this Node program at the same time.

Any guess on how I can work around that? I want it to behave exactly as if it were a program's output (stdout) in a terminal. (Actually, my purpose is to use this to log, from a client, a program that is running on my server.)

Upvotes: 1

Views: 689

Answers (1)

Zuum Dj

Reputation: 96

Maybe a bit late. There is a library for Node to work with log-like files on a line-by-line basis: tail

Its underlying implementation uses Node's fs library (watch and watchFile) to achieve this.

FYI: I'd use a separate file in your code to track how many lines you have processed, and use it to provide the nLines option for crash recovery and consistency.
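A minimal sketch of what that could look like, assuming the tail npm package and the doSomething handler from the question; the stateFile name and the line-count bookkeeping are only illustrative, not part of the library:

import fs from 'fs';
import { Tail } from 'tail';

const logFile = 'log.txt';
const stateFile = 'log.state'; // hypothetical file tracking how many lines were processed

// Restore the processed-line count so a restart knows where it left off
let processed = fs.existsSync(stateFile)
  ? parseInt(fs.readFileSync(stateFile, 'utf8'), 10) || 0
  : 0;

const tail = new Tail(logFile);

// "line" fires once per appended line, even when several lines arrive at once
tail.on('line', (line: string) => {
  processed += 1;
  doSomething(line);
  // Persist progress so the count survives a crash
  fs.writeFileSync(stateFile, String(processed));
});

tail.on('error', (err: Error) => {
  console.log(err);
});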

Upvotes: 0
