Programmer

Reputation: 8727

Capture continuous output from a process

I have an expect script that does passwordless SSH to another box and starts a command that dumps a file's contents, using: tail -f ...

Because I am using tail's -f option, the command waits until new data is appended to the file and dumps it immediately.

I am using exec to run the script from my Node.js code:

var exec = require('child_process').exec;

var child = exec('script.sh process1 process2', function (err, stdout, stderr) {
    if (err) {
        console.log("Error");
        return;
    }
    // stdout arrives as one buffered string once the process
    // exits; split it into lines and log each one
    stdout.toString().split("\r\n").forEach(function (line) {
        logger.info(line);
    });
});

But I am getting the error below in my console logs:

Error in expect script ::Error: maxBuffer exceeded.

Since the output is continuous data on the stdout stream, how can I achieve the desired result? I tried using spawn, but I get the warning below:

NodeJS : warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit

How can I overcome this issue?

EDIT:

It seems the callback (and events) only fire once the command in question has finished running or returned an error. Is there a way to get the output while my command is still executing in the background?

Upvotes: 1

Views: 862

Answers (1)

Amir T

Reputation: 2758

Looks like spawn is the way to go, as it's meant to handle streams, whereas exec has a max buffer of 200 KB.
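For what it's worth, exec does accept a maxBuffer option, sketched below with the command from your question, but since tail -f never finishes, any fixed buffer will eventually overflow, so raising it only postpones the error:

var exec = require('child_process').exec;

// Raising maxBuffer only delays the failure with tail -f,
// because the output never ends; shown for comparison only.
var child = exec('script.sh process1 process2',
    { maxBuffer: 1024 * 1024 }, // 1 MB instead of the 200 KB default
    function (err, stdout, stderr) { /* runs only when the process exits */ });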

Your spawn output is just a warning (possible EventEmitter memory leak detected), not an error.
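Here is a minimal sketch of the spawn approach, reusing script.sh and the logger from your question (both assumed to be in scope). Note that spawn does not go through a shell, so the script path and its arguments are passed separately. The data events fire while the command is still running, which is exactly what tail -f needs:

var spawn = require('child_process').spawn;

var child = spawn('./script.sh', ['process1', 'process2']);

// 'data' fires each time the child writes to stdout,
// so lines arrive while tail -f is still running
child.stdout.on('data', function (data) {
    data.toString().split("\r\n").forEach(function (line) {
        if (line) logger.info(line);
    });
});

child.stderr.on('data', function (data) {
    console.log("Error: " + data);
});

child.on('close', function (code) {
    console.log("script.sh exited with code " + code);
});

Attaching each listener once, outside of any loop, also avoids the EventEmitter warning, which typically appears when the same listener is added repeatedly.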

Upvotes: 2
