user971956

Reputation: 3208

How to queue HTTP GET requests in Node.js in order to control their rate?

I have a Node.js app which sends HTTP GET requests from various places in the code; some are even dependent (send a request, wait for the reply, process it, and based on the results send another request). I need to limit the rate of the requests (e.g., 10 requests per hour).

I thought about queuing the requests and then at some central point releasing them in a controlled manner, but got stuck at how to queue the callback functions and their dependent parameters.

I would be happy to hear suggestions on how to handle this scenario with minimal restructuring of the app.

Thanks

Upvotes: 14

Views: 19840

Answers (4)

JRichardsz

Reputation: 16495

You can use the express-queue library.

Without express-queue

By default, Express processes requests in parallel:

var express = require('express');
var app = express();

app.get('/endpoint', function(req, res) {
  console.log(new Date())
  setTimeout(function () {
    res.type('text/plain');
    res.send('Hello, it\'s about time!!');
  }, 1000)  
});

app.listen(process.env.PORT || 8080);

Log

(log screenshot)

With express-queue

var express = require('express');
var queue = require('express-queue');
var app = express();
app.use(queue({ activeLimit: 1, queuedLimit: -1 }));

app.get('/endpoint', function(req, res) {
  console.log(new Date())
  setTimeout(function () {
    res.type('text/plain');
    res.send('Hello, it\'s about time!!');
  }, 1000)  
});

app.listen(process.env.PORT || 8080);

Log

(log screenshot)

As you can see, the processing is sequential.

Note!!!

  • This works at the cost of memory (RAM): the more incoming requests you queue, the more RAM you need.
  • I advise using a queue platform such as ActiveMQ, RabbitMQ, or proprietary offerings from IBM, Oracle, etc., with a queue listener in Node.js.

Upvotes: 0

lanzz

Reputation: 43158

I would use Deferreds and return one for every queued request. You can then add success/failure callbacks to the deferred promise after it has been queued.

var deferred = queue.add('http://example.com/something');
deferred.fail(function(error) { /* handle failure */ });
deferred.done(function(response) { /* handle response */ });

You can hold [ url, deferred ] pairs in your queue, and each time you dequeue a URL you'll also have the Deferred that goes with it, which you can resolve or reject after you process the request.
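In modern Node the same pattern can be sketched with native Promises playing the role of the Deferred (the RequestQueue class, its method names, and the fixed release interval below are illustrative, not from any particular library):

```javascript
// Sketch: a queue holding { url, resolve, reject } entries that
// releases one request per interval. add() returns a promise the
// caller can attach .then/.catch to, like the deferred above.
class RequestQueue {
  constructor(intervalMs) {
    this.intervalMs = intervalMs;
    this.items = [];
    this.timer = null;
  }

  add(url) {
    return new Promise((resolve, reject) => {
      // reject is stored so a failed HTTP call can be propagated
      this.items.push({ url, resolve, reject });
      this.start();
    });
  }

  start() {
    if (this.timer) return; // already draining
    this.timer = setInterval(() => {
      const item = this.items.shift();
      if (!item) {
        clearInterval(this.timer); // queue empty: stop the timer
        this.timer = null;
        return;
      }
      // Replace this stub with the real HTTP call (e.g. http.get)
      // and call item.resolve(response) or item.reject(error).
      item.resolve('fetched ' + item.url);
    }, this.intervalMs);
  }
}
```

With intervalMs set to 6 * 60 * 1000 this releases roughly 10 requests per hour.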

Upvotes: 0

Wes Johnson

Reputation: 3101

The Async module has a number of control-flow helpers that could help you. queue sounds like a good fit, since it lets you limit concurrency.

Upvotes: 2

topek

Reputation: 18979

I think that you have answered your question already. A central queue that can throttle your requests is the way to go. The only problem here is that the queue has to hold the full information for the request and the callback(s) that should be used. I would abstract this in a QueueableRequest object that could look something like this:

var QueueableRequest = function(url, params, httpMethod, success, failure){
  this.url = url;
  this.params = params;
  this.httpMethod = httpMethod;
  this.success = success;
  this.failure = failure;
}

//Then you can queue your request with

queue.add(new QueueableRequest(
  "api.test.com",
  {"test": 1},
  "GET",
  function(data){ console.log('success'); },
  function(err){ console.log('error'); }
));

Of course this is just sample code that could be much prettier, but I hope you get the picture.

Upvotes: 8
