Reputation: 1396
Using Node's child_process exec, I'm calling an FFmpeg conversion via a promise that takes a bit of time. Each time the user clicks "next", it starts the FFmpeg command on a new file:
function doFFMpeg(path) {
  return new Promise((resolve, reject) => {
    exec('ffmpeg (long running command)', (error, stdout, stderr) => {
      if (error) {
        reject();
      }
    }).on('exit', (code) => { // exit code 0 means success, non-zero means failure
      if (code) {
        reject();
      } else {
        resolve();
      }
    });
  });
}
The problem is, if the user moves on to the next video before the promise settles, I need to scrap the running process and move on to converting the next video.
How do I either:
A) (Ideally) Cancel the current promised exec process*, or
B) Let the current promised exec process complete, but just ignore that promise while I start a new one?
*I realize that promise cancellation is not yet in ECMAScript, but I'd like to know of a workaround -- preferably without using a third-party module / library.
Attempt:
let myChildProcess;
function doFFMpeg(path) {
  myChildProcess.kill();
  return new Promise((resolve, reject) => {
    myChildProcess = exec('ffmpeg (long running command)', (error, stdout, stderr) => {
      if (error) {
        reject();
      }
    }).on('exit', (code) => { // exit code 0 means success, non-zero means failure
      if (code) {
        reject();
      } else {
        resolve();
      }
    });
  });
}
Upvotes: 0
Views: 1706
Reputation: 19288
Assuming exec() does indeed return an object with a .kill() method, your attempt looks pretty close to what you want. You just have to accept promise rejection in lieu of cancellation, which is unavailable in native Promises. Rejecting rather than cancelling is typically inconsequential, and often preferable.
As I understand it, killing the process will cause the callback to fire with an error (and/or the 'exit' handler to fire with a non-zero code). If so, you don't need to reject the promise explicitly when the process is killed - reject() will be called anyway.
Your doFFMpeg() just needs some safety around calling myChildProcess.kill().
Something like this should do it:
const doFFMpeg = (function () {
  let myChildProcess = null;
  return function (path) {
    if (myChildProcess) {
      myChildProcess.kill();
    }
    return new Promise((resolve, reject) => {
      const child = exec('ffmpeg (long running command)', (error, stdout, stderr) => {
        if (error) {
          reject(error);
        }
      });
      child.on('exit', (code) => { // exit code 0 means success, non-zero means failure
        // Only clear the reference if a newer call hasn't already replaced it.
        if (myChildProcess === child) {
          myChildProcess = null;
        }
        if (code) {
          reject(new Error('bad exit'));
        } else {
          resolve();
        }
      });
      myChildProcess = child;
    });
  };
})();
I'm not sure that exec()'s callback is necessary (unless you need to process stdout/stderr or need to know the details of the error). It's possible that the .on('exit') handler alone will suffice, or maybe .on('close') and .on('error') handlers.
If the caller needs to handle "kill errors" differently from other errors, then there's a little more work to do. You will need to ensure that, on kill, the Promise is rejected with a detectable error (e.g. a custom error class, or an Error monkey-patched with a custom property).
Upvotes: 1
Reputation: 15361
If I understand correctly, you want to execute FFmpeg conversions in a chain, one after the other, and kill the active one upon moving to the next if it hasn't finished yet.
Assuming child_process.exec() is used, you could keep track of the child process in a global variable, and when doFFMpeg() is invoked, it should kill() any process still running before instantiating the new promise.
Upvotes: 1