Joe Huang

Reputation: 6570

How to arrange hundreds of network/database calls concurrently with Node.js & mongoDB?

Let's say my server app has 1000 records, and for each record, I have to do two things:

1. Make a Facebook Graph API inquiry
2. Save the inquiry result into MongoDB

Now that I have those 1000 records ready, can I simply iterate through them, make 1000 Facebook API calls, and, when each inquiry result comes back, save the result into MongoDB in the network callback function?

Do I need to make any special arrangement for this? For example, should I limit the total number of concurrent network or database requests? Can I fire all the requests at once, without any delay between them? What's the common way (or framework) to handle this kind of situation, when you have hundreds or thousands of network requests to make?

Upvotes: 0

Views: 116

Answers (1)

Poorna Subhash

Reputation: 2128

In Node.js, you can use the async library and its queue function; refer to the async documentation for details.

Please refer to the outline of sample code below.

var async = require('async');

var invokeFB, invokeDB, fbqueue, dbqueue;

invokeFB = function (input, cb) {
  // Do whatever you need here to get the Facebook response
  var fbresponse;
  // Push the Facebook response onto the DB queue for further processing
  dbqueue.push(fbresponse);
  return cb(null);
};

invokeDB = function (fbresponse, cb) {
  // Insert the response into the database
  return cb(null);
};

// Each queue processes at most 10 tasks concurrently
fbqueue = async.queue(invokeFB, 10);
dbqueue = async.queue(invokeDB, 10);

var records; // get your input records from wherever they come
records.forEach(function (input) {
  fbqueue.push(input);
});

Note: In the sample above I have used two different queues, since the concurrency appropriate for Facebook may differ from the concurrency appropriate for MongoDB.
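If you would rather not add a dependency, the same idea (at most N tasks in flight at once) can be sketched with plain Promises. This is a minimal, illustrative limiter, not the async library's implementation; `worker` here stands in for whatever does the Facebook call and the MongoDB insert for one record.

    function runQueue(items, worker, concurrency) {
      // Process `items` with at most `concurrency` workers running at once;
      // resolves with the array of results once every item has finished.
      return new Promise(function (resolve, reject) {
        var index = 0;    // next item to hand out
        var active = 0;   // workers currently running
        var results = [];
        function next() {
          if (index >= items.length && active === 0) return resolve(results);
          while (active < concurrency && index < items.length) {
            var i = index++;
            active++;
            Promise.resolve(worker(items[i])).then(function (res) {
              results.push(res);
              active--;
              next(); // a slot freed up, start the next item
            }, reject);
          }
        }
        next();
      });
    }

You would call it as, e.g., `runQueue(records, function (r) { return fetchFromFB(r).then(saveToDB); }, 10)`, where `fetchFromFB` and `saveToDB` are placeholders for your own promise-returning functions.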

Upvotes: 2

Related Questions