Reputation: 2376
I am building a Machine Learning project (an integer sequence predictor). The ML code is a Python script. I am trying to write an Express route that handles the stdin and stdout of the Python script and sends a response based on its output.
I spawn the Python script when the app starts:
let p = spawn('python', ['python/script.py'], { stdio: ['pipe', 'pipe', process.stderr] });
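For completeness, a minimal sketch of that setup, assuming the script lives at python/script.py as in the spawn call above (the require line is the only addition):
// Sketch: spawn the predictor once at startup and share the child
// process across all requests; stdin/stdout are piped to the routes.
const { spawn } = require('child_process');
let p = spawn('python', ['python/script.py'], { stdio: ['pipe', 'pipe', process.stderr] });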
My Route
router.get('/', managePredictor);
managePredictor
function managePredictor(req, res, next) {
    try {
        //timestamp
        let ts = new Date().getTime();
        let token = ts + '_' + magic();
        let seq = req.query.seq;
        let re = new RegExp(token + '((?:\\s|\\S)*)' + token);
        p.stdin.write(token + '\n' + seq + '\n');
        p.stdout.resume();
        p.stdout.on('data', (chunk) => {
            str += chunk;
            let s = re.exec(str);
            let o = JSON.parse(s[1].trim());
            p.stdout.pause();
            res.send(o);
        });
    }
    catch (err) {
        res.status(500).send('Internal Server Error');
    }
}
token is a string like 1520661269162_YGo6p3.
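magic() is not shown in the question; purely for illustration, a hypothetical helper that would produce a suffix like YGo6p3 could look like this:
// Hypothetical helper: six random alphanumeric characters appended to
// the timestamp so each request gets a unique token.
function magic() {
    const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
    let s = '';
    for (let i = 0; i < 6; i++) {
        s += chars[Math.floor(Math.random() * chars.length)];
    }
    return s;
}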
It works for the first request to the route, but for any consecutive request I get an error like this:
>Error: Can't set headers after they are sent.
> at validateHeader (_http_outgoing.js:494:11)
> at ServerResponse.setHeader (_http_outgoing.js:501:3)
> at ServerResponse.header (D:\mlproj\node_modules\express
>lib\response.js:730:10)
> at ServerResponse.send (D:\mlproj\node_modules\express >\lib\response.js:170:12)
> at ServerResponse.json (D:\mlproj\node_modules\express >\lib\response.js:256:15)
> at ServerResponse.send (D:\mlproj\node_modules\express >\lib\response.js:158:21)
at Socket.p.stdout.on (D:\mlproj\routes\predict.js:39:11)
at emitOne (events.js:121:20)
at Socket.emit (events.js:211:7)
at addChunk (_stream_readable.js:263:12)
Python Script's IO snippet
while True:
    ts = input()
    inp = input()
    arr = inp.split(" ")
    arr = list(map(lambda x: int(x), arr))
    qq = findNextn(arr, 5)
    print(ts)
    out = json.dumps({'orignal': arr, 'predicted': qq}, indent=4)
    print(out)
    print(ts)
Upvotes: 0
Views: 63
Reputation: 907
You need to handle the Node stream per request for this.
Something like the following should work:
function managePredictor(req, res, next) {
    try {
        //timestamp
        let ts = new Date().getTime();
        let token = ts + '_' + magic();
        let seq = req.query.seq;
        let re = new RegExp(token + '((?:\\s|\\S)*)' + token);
        let str = '';

        p.stdin.write(token + '\n' + seq + '\n');

        // Accumulate stdout until the token-delimited reply is complete,
        // then respond exactly once and stop listening for this request.
        const onData = (chunk) => {
            str += chunk;
            let s = re.exec(str);
            if (s) {
                p.stdout.removeListener('data', onData);
                res.json(JSON.parse(s[1].trim()));
            }
        };
        p.stdout.on('data', onData);
    }
    catch (err) {
        res.status(500).send('Internal Server Error');
    }
}
Upvotes: 1
Reputation: 943480
p.stdout.on('data', (chunk) => { ... res.send(o);
You try to send a response every time you get a chunk of data.
So the first time you get a chunk of data after the request arrives, you send a response.
Then the next time you get a chunk of data, you try to send the response again. You can't do that because it has already been sent.
You need to rethink your approach to how you handle the interaction between HTTP requests and the data from the Python program.
Possibly you should be storing the data in a string each time you get a chunk from the Python process, and then responding to HTTP requests with the current content of that string (while resetting it back to an empty one).
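A minimal sketch of that idea, assuming the shared p child process, the magic() helper, and the token framing from the question (the single persistent listener and the polling interval are illustrative choices, not part of the original code):
// One persistent listener appends everything the Python process prints
// to a buffer; each request waits until its own token-delimited block
// appears, replies once, and removes the consumed part from the buffer.
let buffer = '';
p.stdout.on('data', (chunk) => {
    buffer += chunk;
});

function managePredictor(req, res, next) {
    const token = Date.now() + '_' + magic();
    const re = new RegExp(token + '((?:\\s|\\S)*)' + token);

    p.stdin.write(token + '\n' + req.query.seq + '\n');

    // Check the buffer periodically until this request's reply shows up
    // (a promise queue would also work; polling keeps the sketch short).
    const timer = setInterval(() => {
        const match = re.exec(buffer);
        if (!match) return;
        clearInterval(timer);
        buffer = buffer.replace(match[0], '');   // reset the consumed part
        res.json(JSON.parse(match[1].trim()));
    }, 50);
}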
Upvotes: 1