Reputation: 173
I'm trying to stream an arbitrarily large amount of data over HTTP from a subprocess using node. My full code is here, and the salient bit is:
res.writeHead(200, {
  'Content-Type': 'application/octet-stream',
  'Content-Disposition': 'attachment;filename=osm_export_' +
    ((north + south) / 2) + '_' + ((east + west) / 2) + '.pbf'
});
// run vex
var proc = spawn(cmd, [dbname, south, west, north, east, '-']);
// stream chunks
proc.stdout.pipe(res);
After approximately 40 MB (anywhere between 40,000,000 and 42,000,000 bytes) the stream is interrupted and the request never completes. I can't set a Content-Length
header because I don't know how much data the command will produce until it is done. I'm wondering if this is a buffer underrun; the command in question is extracting data from a database and writing a stream, which could well be slower than the connection between my computer and my server. I suspect this because I replaced the code with this:
var http = require('http');
var spawn = require('child_process').spawn;

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'application/octet-stream'});
  var proc = spawn('head', ['-c', '500000000', '/dev/zero']);
  proc.stdout.pipe(res);
}).listen(8181, '127.0.0.1');
which streams 500MB of null data, and it worked fine. Is there some sort of timeout, etc. that I need to set?
Upvotes: 2
Views: 1396
Reputation: 173
Ah, found the issue. This may or may not be useful to others. I was not doing anything with the process's stderr
stream, and the backend process writes a lot of status information to stderr. Because nothing ever consumed that output, a buffer somewhere inside node was filling up, and once it did the transfer died. Changing the process creation line to
var proc = spawn(cmd, [dbname, south, west, north, east, '-'], {stdio: ['ignore', 'pipe', 'ignore']});
solved the problem. Thanks to all who helped!
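If you want to keep the status output instead of discarding it, the alternative is to keep stderr piped but actively drain it so its buffer can never fill up. A minimal sketch, modelled on the test server above and using a stand-in shell command that writes to both stdout and stderr (not the real export tool from the question):

var http = require('http');
var spawn = require('child_process').spawn;

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'application/octet-stream'});
  // Stand-in for the real export command: writes status noise to
  // stderr and bulk data to stdout.
  var proc = spawn('sh', ['-c',
    'for i in $(seq 1 100000); do echo "status $i" 1>&2; done; ' +
    'head -c 500000000 /dev/zero']);
  // Drain stderr (here, forward it to the server's own stderr) so the
  // pipe buffer never fills; proc.stderr.resume() would discard it instead.
  proc.stderr.pipe(process.stderr);
  proc.stdout.pipe(res);
}).listen(8181, '127.0.0.1');

Forwarding to process.stderr keeps the status lines visible in the server logs; the {stdio: ['ignore', 'pipe', 'ignore']} option above is the simpler choice when that output is pure noise.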
Upvotes: 2
Reputation: 1420
Try setting the Content-Length HTTP header to the size of the data you send in the response.
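That only works if the full output can be buffered in memory first, which defeats the point of streaming arbitrarily large data, but for bounded payloads a minimal sketch (with a stand-in command, and maxBuffer raised above the expected output size) might look like:

var http = require('http');
var execFile = require('child_process').execFile;

http.createServer(function (req, res) {
  // Buffer the whole output so its byte length is known before
  // any headers are sent.
  execFile('head', ['-c', '1000000', '/dev/zero'],
      {encoding: 'buffer', maxBuffer: 16 * 1024 * 1024},
      function (err, stdout) {
    if (err) {
      res.writeHead(500);
      res.end();
      return;
    }
    res.writeHead(200, {
      'Content-Type': 'application/octet-stream',
      'Content-Length': stdout.length
    });
    res.end(stdout);
  });
}).listen(8181, '127.0.0.1');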
Upvotes: 0