Reputation: 73
I'm playing with node and (at the moment) simply trying to stream some files from the filesystem over HTTP.
The Apache Bench (ab) tool on my OS X Lion machine is buggy and seems to abort connections prematurely. This appears to have exposed an issue in my node.js app: it leaks file handles when a connection is aborted.
I've reduced this to a simple node.js test case. When I launch this and run 'ab' against it, I eventually get an EMFILE exception due to too many open file handles:
// app.js
var count = 0;
require('http').createServer(function (req, res) {
  console.log('request ' + ++count);
  res.writeHead(200);
  require('fs').createReadStream(
    '/Users/tom/files/8339cdf73594d8f0aab87da123e9e0380723b471'
  ).pipe(res);
}).listen(3000);
$ node app.js
request 1
...
request 317
request 318
request 319
stream.js:105
throw er; // Unhandled stream error in pipe.
^
Error: EMFILE, too many open files '/Users/tom/files/8339cdf73594d8f0aab87da123e9e0380723b471'
I assume what is happening is that the HTTP connection is aborted, but node's .pipe() mechanism doesn't know to stop reading from the readable stream and close the FD. So I tried handling the 'close' event on the HTTP ServerRequest and destroying the read stream there, but something still seems to be reading from it, because I then get a bad file descriptor error (EBADF) instead:
// app.js
var count = 0;
require('http').createServer(function (req, res) {
  console.log('request ' + ++count);
  res.writeHead(200);
  var file = require('fs').createReadStream(
    '/Users/tom/dev/cloudstore2/files/8339cdf73594d8f0aab87da123e9e0380723b471'
  );
  req.on('close', function () {
    console.log('request received close - destroying read FD');
    file.destroy();
  });
  file.pipe(res);
}).listen(3000);
$ node app.js
...
request 112
request 113
request received close - destroying read FD
request received close - destroying read FD
request received close - destroying read FD
request received close - destroying read FD
request received close - destroying read FD
request received close - destroying read FD
request received close - destroying read FD
events.js:48
throw arguments[1]; // Unhandled 'error' event
^
Error: EBADF, bad file descriptor
Is there a 'correct' way to handle these aborted HTTP connections and close out my associated fs read stream, or am I seeing a bug in node.js' .pipe() handling?
Update: By catching the 'error' event on my file reader, I can keep the app from dying, but is that the correct way to handle this? I feel like I'm missing some way to avoid triggering the bad FD error in the first place.
<snip>
  var file = require('fs').createReadStream(
    '/Users/tom/dev/cloudstore2/files/8339cdf73594d8f0aab87da123e9e0380723b471'
  );
  file.on('error', function (err) {
    console.log('error handled in file reader: ' + err);
  });
  req.on('close', function () {
    console.log('request received close - destroying read FD');
    file.destroy();
  });
  file.pipe(res);
</snip>
Upvotes: 4
Views: 3448
According to the node.js documentation for http.ServerRequest:
Event: 'close'
Indicates that the underlying connection was terminated before response.end() was called or able to flush.
Just like 'end', this event occurs only once per request, and no more 'data' events will fire afterwards.
Note: 'close' can fire after 'end', but not vice versa.
I think response.end() closes the destination of your pipe, which means that if 'end' fires first there is a small window where the pipe could still be trying to write to the response after it's closed. Try moving your file.destroy() call into the 'end' event handler.
I tried to test my idea but I couldn't reproduce the error, so please let me know whether that works for you. I'm worried I will run into this problem when I open my app up to other users.
Upvotes: 2