Reputation: 4861
I have a node.js script that does some logging to a file using WriteStream. On certain events I want to stop execution of the script, i.e. write a warning to the log and exit immediately after that. Since node.js is asynchronous, it does not let us do this in a straightforward way like:
#!/usr/local/bin/node
var fs = require('fs');
var stream = fs.createWriteStream('delme.log', { flags: 'a' });
stream.write('Something bad happened\n');
process.exit(1);
Instead of appending a message to delme.log, this script does nothing to the file. Handling the 'exit' event and flushing doesn't work either. The only way I have found so far to write the last log message before exiting is to wrap process.exit(1) in a setTimeout():
#!/usr/local/bin/node
var fs = require('fs');
var stream = fs.createWriteStream('delme.log', { flags: 'a' });
stream.write('Something bad happened\n');
setTimeout(function(){
process.exit(1);
}, 30);
However, in this form it doesn't stop script execution immediately, and the script keeps running for some time after the critical event. So I'm wondering whether there are other ways to exit a script with a final log message?
Upvotes: 17
Views: 22666
Reputation: 13442
Improved.
var fs = require('fs');
var stream = fs.createWriteStream('delme.log', { flags: 'a' });

// Gracefully close the log on an uncaught exception
process.on('uncaughtException', function () {
  stream.write('\n'); // Make sure the drain event will fire (the queue may be empty!)
  stream.on('drain', function () {
    process.exit(1);
  });
});

// Any code goes here...
stream.write('Something bad happened\n');
throw new Error('Something bad happened');
The try-catch approach works too, but it is ugly. Still, credit goes to @nab; I just prettified it.
Upvotes: 5
Reputation: 4861
To flush all log messages to a file before exiting, one might want to wrap the script execution in a try-catch block. Once something bad has happened, it is logged and an exception is thrown; the exception is caught by the outer try-catch, from which it is safe to exit asynchronously:
#!/usr/local/bin/node
var fs = require('fs');
var stream = fs.createWriteStream('delme.log', { flags: 'a' });
var SOMETHING_BAD = 'Die now';
try {
  // Any code goes here...
  if (somethingIsBad) {
    stream.write('Something bad happened\n');
    throw new Error(SOMETHING_BAD);
  }
} catch (e) {
  if (e.message === SOMETHING_BAD) {
    stream.on('drain', function () {
      process.exit(1);
    });
  } else {
    throw e;
  }
}
Upvotes: 3
Reputation: 18205
Since you want to block, and already are using a stream, you will probably want to handle the writing yourself.
var data = new Buffer('Something bad happened\n');
fs.writeSync(stream.fd, data, 0, data.length, stream.pos);
process.exit();
Upvotes: 14
Reputation: 5484
I think this is the right way:
process.on('exit', function () {
  // You need to use a synchronous, blocking function here,
  // not streams or even console.log, which are non-blocking.
  console.error('Something bad happened\n');
});
Upvotes: 4
Reputation: 1401
I would advocate just writing to stderr in this event, e.g. a trivial example:
var util = require('util');
console.error(util.inspect(exception));
and then letting the supervising* process handle log persistence. From my understanding, nowadays you don't have to worry about stdout and stderr not flushing before node exits (although I did see the opposite, problematic behavior in some of the 0.2.x versions).
(*) For supervising process take your pick from supervisord, god, monit, forever, pswatch etc...
This also provides a clean path to using PaaS providers such as Heroku, dotcloud, etc.: let the infrastructure manage the logging.
Upvotes: 2