Reputation: 60
I'm using Node.js to capture a video stream and then calling ffmpeg via spawn to process the feed. I would like to do this in parallel if I receive multiple video streams.
I can mimic this manually by opening multiple terminals and using a different variation of the ffmpeg command to execute the processes.
I get that Node.js is single-threaded; I have reviewed async but haven't figured out how it applies here, given that callbacks play an integral role.
Essentially I want to call multiple ffmpeg processes in parallel without opening several terminal windows, and handle events such as errors or exit.
e.g.
//spawn ffmpeg with params
var spawn = require('child_process').spawn;
var ffmpegexec = spawn('ffmpeg', ['-i', 'pathname', '-vcodec', 'libx264', '-acodec', 'libfdk_aac', '-threads', '2', '-s', '320x240', 'filename.mp4']);
//deal with output
ffmpegexec.stdout.on('data', function(data) {
    console.log("stdout: " + data);
});
ffmpegexec.stderr.on('data', function(data) {
    console.log("stderr: " + data);
});
ffmpegexec.on('exit', function(code) {
    console.log("exit: " + code);
});
I was thinking that perhaps I could launch each new process in a separate Node.js instance, though I'm open to other recommendations.
Upvotes: 3
Views: 7121
Reputation: 156364
You can simply call spawn(...)
multiple times consecutively. Each call to child_process.spawn(...)
starts a new OS process, and a large number of these can run concurrently. You will have to register separate stdout/stderr/exit handlers for each instance, but if they are similar you can use the same (or similar) handlers.
For example, if you wanted to run the same commands on separate filenames you could do the following:
var spawn = require('child_process').spawn;

function runFfmpeg(filename) {
    var proc = spawn('ffmpeg', ['-i', 'pathname', ..., filename]);
    proc.stdout.on('data', function(data) { console.log("stdout: " + data); });
    proc.stderr.on('data', function(data) { console.log("stderr: " + data); });
    proc.on('exit', function(code) { console.log("exit: " + code); });
}
var filenames = ['foo.mp4', 'bar.mp4', 'gah.mp4'];
filenames.forEach(function(filename) { runFfmpeg(filename); });
Keep in mind that this could incur a huge load on your machine. You will likely need to throttle the number of threads/processes running at any time based on the resources available on the target physical machine (e.g. one process per physical CPU and one or two threads per CPU core).
Upvotes: 6
Reputation: 145984
This will happen naturally in node's normal code path. If you had two files like ['video1.mp4', 'video2.mp4']
and looped over that array, passing each filename to your function that spawns ffmpeg
, they would run in parallel. Fundamentally this is straightforward to achieve in node.
However, it is naive to do this within a network service (your question isn't clear as to whether this has a web interface, is a CLI, or something else), because it is then trivial to DoS the system to a halt by launching an untenable number of these in parallel. Here is where async.eachLimit
comes to the rescue with a sensible queue/batch paradigm.
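If you'd rather not add the async package as a dependency, the eachLimit queue pattern is small enough to hand-roll. This sketch mimics the callback signature of async's eachLimit but is not the library itself, and the setTimeout stands in for spawning ffmpeg and waiting for its 'exit' event:

```javascript
// Minimal stand-in for async.eachLimit(items, limit, iteratee, done):
// calls iteratee(item, callback) for each item, never more than `limit`
// at a time, then invokes done(err) once all items have finished (or
// immediately with the first error).
function eachLimit(items, limit, iteratee, done) {
  var next = 0;      // index of the next item to start
  var running = 0;   // iteratees currently in flight
  var finished = 0;  // iteratees completed
  var failed = false;
  function launch() {
    while (running < limit && next < items.length) {
      running++;
      iteratee(items[next++], function (err) {
        running--;
        finished++;
        if (failed) return;       // already reported an error
        if (err) {
          failed = true;
          return done(err);
        }
        if (finished === items.length) done(null);
        else launch();
      });
    }
  }
  if (items.length === 0) done(null);
  else launch();
}

// Usage sketch: process at most two videos at a time.
eachLimit(['video1.mp4', 'video2.mp4', 'video3.mp4'], 2,
  function (filename, callback) {
    setTimeout(function () {
      console.log('finished ' + filename);
      callback(null);
    }, 10);
  },
  function (err) {
    if (err) console.error('a job failed: ' + err);
    else console.log('all done');
  });
```

With the real async library you would simply `npm install async` and call async.eachLimit with the same arguments, spawning ffmpeg inside the iteratee and invoking the callback from the process's 'exit' handler.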
Upvotes: 3