Reputation: 2905
I'm building a simple HTTP server that serves a fairly large file to clients using streams. I need to serve multiple clients at the same time, and I'm wondering what the simplest way to achieve that is.
My initial feeling is that using the cluster module and forking {num CPUs} processes might be the simplest way (see the sketch after my current code below).
var StreamBrake = require('streambrake');
var http = require('http');
var fs = require('fs');

var server = http.createServer(function (req, res) {
  // StreamBrake throttles the stream so each transfer takes a while.
  var stream = fs.createReadStream('/data/somefile');
  stream.pipe(new StreamBrake(10240)).pipe(res);
});

server.listen(1234, 10); // port 1234, backlog of 10
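For reference, here's roughly what the cluster version I have in mind would look like. This is just a minimal sketch using the stock cluster API, with os.cpus().length standing in for {num CPUs}:
var cluster = require('cluster');
var http = require('http');
var fs = require('fs');
var os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU; the cluster module distributes incoming connections.
  for (var i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // Each worker runs its own copy of the server on the shared port.
  http.createServer(function (req, res) {
    fs.createReadStream('/data/somefile').pipe(res);
  }).listen(1234);
}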
Edit: To clarify, the issue is that this code won't begin to serve the second client until it has finished serving the first.
Upvotes: 1
Views: 1388
Reputation: 18860
After thinking about your issue, I'm confused as to why you believe there's an issue. createReadStream is asynchronous. Here is an example that complicates the code a little in order to demonstrate that, using createReadStream, we can indeed service multiple connections at a time.
/*jshint node:true*/
var http = require('http');
var fs = require('fs');

var activeRequests = 0;
var launchFiveRequests = 5;

http.createServer(function (req, res) {
  activeRequests++;
  console.log('Request received');

  var readStream = fs.createReadStream('play.js', { highWaterMark: 1024 });
  readStream.setEncoding('utf8');

  readStream.on('data', function (data) {
    // Logs how many requests are in flight with every chunk served.
    console.log(activeRequests);
    res.write(data);
  });

  readStream.on('end', function () {
    res.end();
    console.log('end');
    activeRequests--;
  });
}).listen(8080, function () {
  // Only start firing requests once the server is actually listening.
  var options = {
    hostname: 'localhost',
    port: 8080,
    path: '/'
  };

  while (launchFiveRequests--) {
    http.get(options, function (res) {
      console.log('response received');
    });
  }
});
Serve up a sufficiently large file, and you should see that all 5 requests are live at once and that they all finish at roughly the same time.
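As an aside, the manual 'data'/'end' handlers can be collapsed into a single pipe() call, which also handles backpressure for you (the read stream pauses whenever the client can't keep up). A sketch of the same server:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  // pipe() forwards chunks as they arrive and calls res.end() when the file ends.
  fs.createReadStream('play.js').pipe(res);
}).listen(8080);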
Upvotes: 1