Reputation: 990
I have a Node Express/Socket.IO application where my Express server makes several API calls with node-rest-client, looping through the elements in var jobs, and when each call finishes it sends the data via Socket.IO to the client. However, every now and then I get a socket hang up error after about 1000 or so API calls.
events.js:182
      throw er; // Unhandled 'error' event
      ^

Error: socket hang up
    at createHangUpError (_http_client.js:345:15)
    at Socket.socketOnEnd (_http_client.js:437:23)
    at emitNone (events.js:110:20)
    at Socket.emit (events.js:207:7)
    at endReadableNT (_stream_readable.js:1059:12)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickCallback (internal/process/next_tick.js:180:9)
How do you handle these errors? Or perhaps my initial attempt at this function is poorly written, in which case, any suggestion on how to make multiple API calls and emit the results to all connected sockets? (Requirement: the only way I can get this information is by making these API calls.)
Server:
setInterval(function(){
    var jobs = ['J1', 'J2', 'J3', 'J4'];
    var full_data = {};
    for(var i = 0; i < jobs.length; i++){
        client.get("MY URL", function (data, response) {
            io.sockets.emit('progressbar', data);
        });
    }
    console.log(full_data);
}, 5000)
where 'progressbar' is the event the client listens on for the data.
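For reference, the client side is roughly this (a sketch; only the 'progressbar' event name comes from my actual code, the handler just illustrates where the data arrives):

// Client-side sketch (assumes the socket.io client script is loaded)
var socket = io();
socket.on('progressbar', function (data) {
    // Update the progress bar UI with the received job data
    console.log('progressbar update', data);
});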
Upvotes: 0
Views: 1350
Reputation: 707158
If your jobs array gets larger, then you may just have too many requests in flight at the same time. It could be:

1. You're overwhelming the target server with too many simultaneous requests.
2. You're running out of local resources (sockets) because too many requests are outstanding at once.
3. The setInterval() timer fires again before the previous batch of requests has finished, so unfinished requests pile up.

I'd suggest the following solution to handle all those issues:
const Promise = require('bluebird');
const util = require('util');

// promisified version of client.get()
client.getAsync = util.promisify(client.get);
function runJobs() {
    var jobs = ['J1', 'J2', 'J3', 'J4'];
    var full_data = {};
    Promise.map(jobs, function(job) {
        return client.getAsync("MY URL").then(data => {
            io.emit('progressbar', data);
        }).catch(err => {
            console.log('something went wrong on the request', err.request.options);
            // eat the error on purpose to keep going
        });
    }, {concurrency: 5}).then(() => {
        // All done, process all final data here
        // Then, schedule the next iteration
        setTimeout(runJobs, 5000);
    });
}

runJobs();
This runs a max of 5 requests at a time (you can play with adjusting that number), which solves both items 1 and 2 above. And, instead of setInterval(), it uses a recurring setTimeout() so that it won't ever schedule the next iteration until the prior one is done (even if the target server gets really slow).
Upvotes: 1
Reputation: 990
It turned out to be the client.get() request causing the error. Here is my code to fix this. It still errors, but at least the error is handled and won't cause the Node server to crash. If there is a more elegant way of handling this, please let me know!
setInterval(function(){
    var jobs = ['J1', 'J2', 'J3', 'J4'];
    var full_data = {};
    for(var i = 0; i < jobs.length; i++){
        client.get("MY URL", function (data, response) {
            io.sockets.emit('progressbar', data);
        }).on('error', function (err) {
            console.log('something went wrong on the request', err.request.options);
        });
    }
    console.log(full_data);
}, 5000)
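If the hang ups come from slow or stalled responses, node-rest-client also supports per-request timeouts through its requestConfig/responseConfig options and matching 'requestTimeout'/'responseTimeout' events (check the library's README for the exact option names in your version). A rough sketch:

// Sketch: per-request timeouts with node-rest-client
// (option names per the library's README; verify against your installed version)
var args = {
    requestConfig: { timeout: 3000 },   // max ms to send the request
    responseConfig: { timeout: 3000 }   // max ms to wait for the response
};

client.get("MY URL", args, function (data, response) {
    io.sockets.emit('progressbar', data);
}).on('requestTimeout', function (req) {
    console.log('request timed out');
    req.abort();
}).on('responseTimeout', function (res) {
    console.log('response timed out');
}).on('error', function (err) {
    console.log('something went wrong on the request', err.request.options);
});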
Upvotes: 0