Manel R. Doménech

Reputation: 173

Thousands of concurrent http requests in node

I have a list of thousands of URLs. I want to hit a health-check endpoint (healt.php) on each of them with an HTTP request.

This is my problem:

I've written an application in Node. It makes the requests in a pooled way, using a variable to control how many concurrent connections are open at once (300, for example). Taken one by one, each request is fast, no more than 500ms.
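For reference, the pooling pattern is roughly this (a simplified sketch, not my actual agent.js; CONCURRENCY and urls are placeholders):

var http = require('http');

var CONCURRENCY = 300;                   // how many requests may be in flight at once
var urls = [ /* thousands of hostnames */ ];
var next = 0;

function checkNext() {
  if (next >= urls.length) return;
  var host = urls[next++];
  var started = Date.now();

  http.get({ host: host, path: '/healt.php' }, function (res) {
    res.resume();                        // drain the body so the socket is freed
    res.on('end', function () {
      console.log((Date.now() - started) + 'ms\t' + host);
      checkNext();                       // this "slot" picks up the next URL
    });
  }).on('error', function (err) {
    console.log('error\t' + host + '\t' + err.message);
    checkNext();
  });
}

// Fill the pool: start CONCURRENCY requests; each finished request pulls the next URL.
for (var i = 0; i < CONCURRENCY; i++) checkNext();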

But when I run the application, the result is:

$ node agent.js

200ms   url1.tld
250ms   url4.tld
400ms   url2.tld
530ms   url8.tld
800ms   url3.tld
...
2300ms  urlN.tld
...
30120ms urlM.tld

It seems there is a limit on concurrency. When I execute

$ ps axo nlwp,cmd | grep node

The result is:

6 node agent.js

There are only 6 threads managing all the concurrent connections. I found an environment variable that controls concurrency in Node: UV_THREADPOOL_SIZE

$ UV_THREADPOOL_SIZE=300 node agent.js

200ms   url1.tld
210ms   url4.tld
220ms   url2.tld
240ms   url8.tld
400ms   url3.tld
...
800ms  urlN.tld
...
1010ms urlM.tld

The problem is still there, but the results are much better. With the ps command:

$ ps axo nlwp,cmd | grep node

132 node agent.js

Next step: looking in the Node source code, I found a constant in deps/uv/src/unix/threadpool.c:

#define MAX_THREADPOOL_SIZE 128

OK. I changed that value to 2048, recompiled and reinstalled Node, and ran the command again:

$ UV_THREADPOOL_SIZE=300 node agent.js

Everything seems OK: response times no longer increase gradually. But when I try a bigger concurrency number, the problem reappears. This time it isn't related to the number of threads, because ps shows there are enough of them.

I tried writing the same application in golang, but the results are the same: the times increase gradually.

So, my question is: where is the concurrency limit? Memory, CPU load, and bandwidth are all within bounds, and I have tuned sysctl.conf and limits.conf to avoid hitting limits on files, ports, memory, and so on.

Upvotes: 3

Views: 1758

Answers (2)

chovy

Reputation: 75834

If you're using request or request-promise, you can set the pool size:

  request({
    url: url,
    json: true,
    pool: {maxSockets: Infinity},
    timeout: 2000
  })

More info here: https://github.com/request/request
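For example, applied to a big list of health-check URLs (just a sketch, assuming the request package and a urls array; in practice you'd probably keep your own concurrency cap on top of this):

var request = require('request');

var urls = ['url1.tld', 'url2.tld' /* ... */];

urls.forEach(function (host) {
  var started = Date.now();
  request({
    url: 'http://' + host + '/healt.php',
    pool: { maxSockets: Infinity },      // no per-host cap on pooled sockets
    timeout: 2000
  }, function (err, res) {
    if (err) {
      console.log('error\t' + host + '\t' + err.message);
    } else {
      console.log((Date.now() - started) + 'ms\t' + host + '\t' + res.statusCode);
    }
  });
});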

Upvotes: 0

Emmett

Reputation: 14337

You may be throttled by http.globalAgent's maxSockets. Depending on whether you're using http or https, see if this fixes your problem:

require('http').globalAgent.maxSockets = Infinity;
require('https').globalAgent.maxSockets = Infinity;
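For example, set it once at startup, before any request goes out (a minimal sketch using plain http.get):

var http = require('http');

// Raise the per-host socket limit before the first request is made.
http.globalAgent.maxSockets = Infinity;

http.get({ host: 'url1.tld', path: '/healt.php' }, function (res) {
  res.resume();                          // consume the body so the socket is released
  res.on('end', function () {
    console.log('url1.tld', res.statusCode);
  });
}).on('error', console.error);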

Upvotes: 2
