Reputation:
I have code that is structured roughly as follows:
readLine.on('line', function (line) {
    async.parallel([
        function (callback) {
            // HTTP request #1 sent with accompanying JSON, get JSON as result
        },
        function (callback) {
            // HTTP request #2 sent with accompanying JSON, get JSON as result
        }],
        function (error, results) {
            // write to local file
        });
});
I implemented the input stream with the line-by-line module.
My problem is that once my code has been running for 30 seconds or so, I get ECONNREFUSED and ECONNRESET errors, which I assume is because my TCP connections are overloaded (all the HTTP requests go to a localhost server). It definitely doesn't help that the text file I have to read has about 200,000 lines.
Is there any way to process the lines and HTTP requests in batches, waiting until all the requests in a batch have returned successfully, so I don't overload my TCP sockets?
Thanks in advance for any help and advice.
Upvotes: 3
Views: 1838
Reputation: 203231
An option would be to use async.queue, which has configurable concurrency:
var queue = async.queue(function(task, queueCallback) {
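    // each object pushed onto the queue arrives here as `task`,
    // so the current line is available as task.line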
    async.parallel([
        function (callback) {
            // HTTP request #1 sent with accompanying JSON, get JSON as result
        },
        function (callback) {
            // HTTP request #2 sent with accompanying JSON, get JSON as result
        }],
        function (error, results) {
            // write to local file
            ...
            // call the queue callback so the queue can start the next task
            queueCallback(error);
        });
}, 10); // 10 concurrent 'tasks' max
readLine.on('line', function (line) {
    queue.push({ line: line });
});
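One caveat: with ~200,000 lines, the internal queue can still grow without bound if lines are read faster than tasks complete. A minimal flow-control sketch, assuming readLine exposes the pause()/resume() methods the line-by-line module provides (async's queue offers saturated and drain hooks):
queue.saturated = function () {
    readLine.pause();  // stop reading once the queue hits its concurrency limit
};
queue.drain = function () {
    readLine.resume(); // resume reading when every queued task has finished
};
Pausing on saturated and resuming on drain keeps only a handful of lines buffered at a time instead of the whole file.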
Upvotes: 1