Reputation: 877
I'm working on a nodejs application and I need to pipe a multi-line string into a shell command. I'm not a pro at shell scripting, but if I run this command in my terminal it works just fine:
$((cat $filePath) | dayone new)
Here's what I've got for the nodejs side. The dayone command does work but there is nothing piped into it.
const cp = require('child_process');
const terminal = cp.spawn('bash');
var multiLineVariable = 'Multi\nline\nstring';
terminal.stdin.write('mul');
cp.exec('dayone new', (error, stdout, stderr) => {
  console.log(error, stdout, stderr);
});
terminal.stdin.end();
Thanks for any help!
Upvotes: 6
Views: 2854
Reputation: 79
With Readable Streams it's really easy to listen to the input:
const chunks = [];

process.stdin.on('readable', () => {
  let chunk;
  // read() returns null once the internal buffer is drained
  while ((chunk = process.stdin.read()) !== null) {
    chunks.push(chunk);
  }
});

process.stdin.on('end', () => {
  const result = Buffer.concat(chunks);
  console.log(result.toString());
});
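(Assuming that script is saved as, say, read.js, you can try it by piping something into it, e.g. cat somefile.txt | node read.js.)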
With Writable Streams you can write to stdout:
process.stdout.write('Multi\nline\nstring');
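(If this were saved as, say, write.js, its output could then be piped into dayone from the shell with node write.js | dayone new.)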
Hope this helps!
Upvotes: 2
Reputation: 302
Here, you're starting up bash using spawn, but then you're using exec to start your dayone program. They are separate child processes and aren't connected in any way.
'cp' is just a reference to the child_process module, and spawn and exec are just two different ways of starting child processes.
You could use bash and write your dayone command to stdin in order to invoke dayone (as your snippet seems to be trying to do; a sketch of that route follows the example below), or you could just invoke dayone directly with exec (bear in mind exec still runs the command in a shell):
// get the child_process module
const cp = require('child_process');

var multiLineVariable = 'Multi\nline\nstring';

// open a child process; exec runs the command in a shell
const child = cp.exec('dayone new', (error, stdout, stderr) => {
  console.log(error, stdout, stderr);
});

// write your multiline variable to the child process's stdin
child.stdin.write(multiLineVariable);
child.stdin.end();
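For reference, the other route (spawning bash and piping the string through it) might look something like the sketch below; the exact dayone invocation is only an assumption here:
// sketch: run the command in a shell and pipe the string into it
const cp = require('child_process');

var multiLineVariable = 'Multi\nline\nstring';

// bash -c runs the command; whatever we write to bash.stdin becomes dayone's stdin
const bash = cp.spawn('bash', ['-c', 'dayone new']);

bash.stdin.write(multiLineVariable);
bash.stdin.end();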
Upvotes: 5