Reputation: 100010
I am using several child processes with a Node.js parent process, and I am piping all the stderr from the child processes to a file.
Like so:
const strmPath = path.resolve(projRoot + '/suman/stdio-logs/runner-stderr.log');
const strm = fs.createWriteStream(strmPath);
//before I call pipe I write some stuff to the log file
strm.write('\n\n>>> Suman start >>>\n');
strm.write('Beginning of run at ' + Date.now() + ' = [' + new Date() + ']' + '\n');
strm.write('Command = ' + JSON.stringify(process.argv) + '\n');
// when the parent process exits, I want to write one more line to the log file
process.on('exit', function (code) {
  strm.write('<<<<<< Suman runner end <<<<<<\n');
});
in the parent process I pipe data to the file like so:
const n = cp.fork(path.resolve(__dirname + '/run-child.js'), argz, {silent: true});
n.on('message', function (msg) {
  handleMessage(msg, n);
});
n.stdio[2].setEncoding('utf-8');
n.stdio[2].pipe(strm);
The problem is that the 'end' event fires on the stream before process.on('exit') occurs, so the final line is never written. I am looking for some way to hook into the stream so that, before 'end' is called, I can write to it at least once.
Is there a way to do this?
(As an aside, I am also looking for a way to pipe a child process's stderr stream to the file directly, without it having to go through the parent process first. This I can probably do by simply calling process.stderr.pipe() in each child process.)
Upvotes: 2
Views: 464
Reputation: 17498
Maybe this will help. Here the automatic closing of strm
is prevented when n.stdio[2]
closes, so you can end the strm
stream manually with a last chunk:
n.stdio[2]
  .on('end', function () {
    strm.end('<<<<<< Suman runner end <<<<<<\n');
  })
  .pipe(strm, {end: false});
Upvotes: 1