Reputation: 11409
In other programs I've written, I've enjoyed the asynchronous aspects of node.js using promises.
I would like to employ this same programming style (using node.js) for Linux scripting. In other words, I'd like the ability to simultaneously execute multiple Linux commands, and then after those commands are complete, have the node.js script execute another grouping of commands asynchronously, and so on (without blocking).
I came across an article that shows how to perform synchronous Linux commands using node.js, but I have yet to find a similar tutorial that covers the management of multiple asynchronous Linux commands using node.js.
Is this currently possible? If so, could you direct me to some specific resources that could help me get started with this goal?
Upvotes: 2
Views: 3517
Reputation: 10151
I'm not sure if I'm right, but I think you are looking for exec and spawn. Please see the related API documentation. There are examples for both commands in the documentation.
exec and spawn
exec is the "simple" version of spawn. The former uses a single callback to report back to the user when the command is completed, and only when it has completely finished or failed.
var exec = require('child_process').exec;

var child = exec('cat *.js bad_file | wc -l',
  function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null) {
      console.log('exec error: ' + error);
    }
  });
So basically the supplied callback is only called when everything that was written to stdout/stderr is available, completely. Only after the process terminates (with success or failure) is the callback called, and you can act upon it. If it failed, error is truthy.
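Since exec is itself asynchronous, you can already run a group of commands simultaneously just by calling it several times and counting completions. A minimal sketch of that idea (the commands and the pending counter are my own illustration, not from the documentation):

var exec = require('child_process').exec;

var commands = ['ls -lh /usr', 'whoami', 'date'];
var pending = commands.length;

commands.forEach(function (cmd) {
  exec(cmd, function (error, stdout, stderr) {
    if (error !== null) {
      console.log('exec error for "' + cmd + '": ' + error);
    } else {
      console.log(cmd + ' -> ' + stdout.trim());
    }
    // the group is done when the last callback fires
    if (--pending === 0) {
      console.log('all commands finished; the next group can start here');
    }
  });
});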
spawn is different, because you can listen on stdout/stderr events. First you "spawn" a process. The spawn function returns a reference to the child process.
var spawn = require('child_process').spawn;
var ls = spawn('ls', ['-lh', '/usr']);
ls here is the child process you've spawned. It has two properties (of importance right now), stdout and stderr, which are event emitters. They emit the data event: when something is written to either stream, the callbacks registered on the data event are called.
ls.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});

ls.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});
There are other important events too, of course (check the documentation for the most up-to-date and relevant information).
ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});
You would use spawn when you want to capture stuff on stdout, for example, while the process is running. A good example would be spawning an ffmpeg encoding task which takes minutes to finish. You could listen on stderr (because ffmpeg writes progress information to stderr instead of stdout) to parse "progress" information.
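As a rough sketch of that ffmpeg scenario (the file names and the time= pattern are assumptions about ffmpeg's arguments and output format, so treat this as illustrative only):

var spawn = require('child_process').spawn;

// hypothetical encoding task
var ffmpeg = spawn('ffmpeg', ['-i', 'input.mp4', 'output.webm']);

ffmpeg.stderr.on('data', function (data) {
  // ffmpeg progress lines usually contain "time=HH:MM:SS.xx"
  var match = /time=(\d+:\d+:\d+\.\d+)/.exec(data.toString());
  if (match) {
    console.log('encoded up to ' + match[1]);
  }
});

ffmpeg.on('close', function (code) {
  console.log('ffmpeg exited with code ' + code);
});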
carrier
There is a nice additional library you can use together with spawn. It's called carrier. It helps reading "lines" from the stdout/stderr of spawned processes. It's useful because the data parameter passed to the callbacks doesn't necessarily contain "complete" lines separated by \n; carrier helps with that. (However, it won't help you capture ffmpeg's progress on stderr, because ffmpeg writes no newlines in that case, only carriage returns; the line is basically always rewritten.)
You would use it like this:

var spawn = require("child_process").spawn;
var carry = require("carrier").carry;

var child = spawn("command");
carry(child.stdout, function(line) {
  console.log("stdout", line);
});
If you would like to use a promise/deferred style approach, then you could do something like the following using Q, which is used by AngularJS - or at least something very similar (see the link for a full tutorial on promises). spawn returns an event emitter, not a promise, so you have to wrap the call to spawn (see Using Deferreds).
var q = require("q");
var spawn = require("child_process").spawn;
var ls = function() {
var deferred = q.defer();
var ls = spawn("ls", ["-lh", "/usr"]);
ls.stdout.on("data", function(data) {
deferred.notify({stdout: true, data: data});
});
ls.stderr.on("data", function(data) {
deferred.notify({stderr: true, data: data});
});
ls.on("close", function(code) {
if (code === 0) {
deferred.resolve();
} else {
deferred.reject(code);
}
});
return deferred.promise;
};
Executing ls() now returns a promise which you would use like any other promise. When it resolves completely, the first callback is called. If an error occurs (the process exits with a non-zero exit code), the error handler is called. While the command progresses, the third (notify) callback is called.
ls().then(function() {
  console.log("child process exited successfully");
}, function(err) {
  console.log("child process exited with code " + err);
}, function(args) {
  if (args.stdout) {
    console.log("stdout: " + args.data);
  } else {
    console.log("stderr: " + args.data);
  }
});
When something gets written to stderr you could call reject immediately; however, that is a design decision. Going back to the ffmpeg example, this wouldn't do you any good, because ffmpeg writes general information to stderr. However, it could work with other commands.
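To tie this back to your goal of running groups of commands: once each command is wrapped in a promise like ls above, q.all lets you run one group in parallel and start the next group only when the first has fully resolved. A sketch, using a run(cmd, args) helper that is my own generalization of the ls wrapper above:

var q = require("q");
var spawn = require("child_process").spawn;

// same wrapping pattern as ls above, just generalized
var run = function(cmd, args) {
  var deferred = q.defer();
  var child = spawn(cmd, args);
  child.on("close", function(code) {
    if (code === 0) {
      deferred.resolve();
    } else {
      deferred.reject(code);
    }
  });
  return deferred.promise;
};

// the first group runs in parallel; the second group starts
// only after every promise in the first group has resolved
q.all([run("ls", ["-lh", "/usr"]), run("whoami", [])])
  .then(function() {
    return q.all([run("date", []), run("uptime", [])]);
  })
  .then(function() {
    console.log("both groups finished");
  }, function(code) {
    console.log("a command failed with code " + code);
  });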
I think you'll get it :)
Examples are taken from Node.js's documentation, because they are well understood.
Upvotes: 9
Reputation: 9437
I recommend an approach combining generators and promises. Here are the prerequisites:
You need Node.js run with the --harmony flag, or any version of io.js (no flags required).

Then to do the sort of thing you want:
var co = require('co');
var adapt = require('ugly-adapter');
var childProcessExec = require('child_process').exec;
var exec = adapt.part(childProcessExec);

co(function*() {

  // run commands in parallel by yielding
  // a single object or array
  var result = yield {
    listing: exec('ls -l'),
    mkdirResult: exec('mkdir foo'),
    blah: exec('echo "blah blah"'),
  };
  console.log(result.blah); // "blah blah"

  // run commands in series by yielding
  // one thing at a time
  var listing = yield exec('ls -l');
  var mkdirResult = yield exec('mkdir foo2');
  var blah = yield exec('echo "blah blah"');
  console.log(blah); // "blah blah"

}).catch(function(err) {
  // this handles any possible errors
  // thrown by the above
  console.error(err.stack);
  process.exit(1);
});
The yield keyword causes the function to pause, while co unwraps the promise and sends the result back into your function.
Note: the co library is really just a temporary stand-in for async/await, which is coming in ES7 and works essentially the same way.
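For comparison, here is a rough sketch of the same parallel/series pattern with native async/await; this assumes a newer Node.js where util.promisify and async/await are available, which goes beyond what the co setup above requires:

var util = require('util');
var exec = util.promisify(require('child_process').exec);

async function main() {
  // parallel: start all three commands, then wait for all of them
  var results = await Promise.all([
    exec('ls -l'),
    exec('mkdir foo'),
    exec('echo "blah blah"'),
  ]);
  console.log(results[2].stdout); // "blah blah"

  // series: each await pauses until the previous command finishes
  var listing = await exec('ls -l');
  var blah = await exec('echo "blah blah"');
  console.log(blah.stdout); // "blah blah"
}

main().catch(function(err) {
  console.error(err.stack);
  process.exit(1);
});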
Upvotes: 2
Reputation: 23070
In the core node libraries, synchronous functions have Sync appended to the function name. The article you found uses child_process.execFileSync, so you should be looking for child_process.execFile for the async version of the function.
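For example, a minimal async use of execFile looks like this (the command and arguments are just an illustration):

var execFile = require('child_process').execFile;

// the callback runs once the command has finished
execFile('ls', ['-lh', '/usr'], function (error, stdout, stderr) {
  if (error) {
    console.log('execFile error: ' + error);
    return;
  }
  console.log(stdout);
});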
Upvotes: 2