SzymonPoltorak

Reputation: 616

nodejs multiple http requests in loop

I'm trying to make a simple feed reader in Node and I'm facing a problem with multiple requests in node.js. For example, I have an array of URLs, something like:

var urls = [
    "http://url1.com/rss.xml",
    "http://url2.com",
    "http://url3.com"
];

Now I want to get the contents of each URL. My first idea was to use for(var i in urls), but that's not a good idea. The best option would be to do it asynchronously, but I don't know how.

Any ideas?

EDIT:

I got this code:

var data = [];
for (var i = 0; i < urls.length; i++) {
    http.get(urls[i], function(response) {
        console.log('Response: ', response.statusCode, ' from url: ', urls[i]);
        var body = '';
        response.on('data', function(chunk) {
            body += chunk;
        });

        response.on('end', function() {
            data.push(body);
        });
    }).on('error', function(e) {
        console.log('Error: ', e.message);
    });
}

The problem is that the "http.get..." line is called first for each element in the loop, and only after that do the response.on('data') and response.on('end') events fire. It makes a mess and I don't know how to handle it.
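For illustration, here is a minimal, network-free sketch of one part of the mess: every callback created in a `var` loop shares the same `i`, so by the time any callback runs, `i` already equals `urls.length`. Passing the value into a function gives each callback its own copy. The callback arrays below are stand-ins for the real `http.get` callbacks:

```javascript
var urls = ["a", "b", "c"];

// The bug: all callbacks close over the same shared `i`.
var logged = [];
var callbacks = [];
for (var i = 0; i < urls.length; i++) {
    callbacks.push(function() { logged.push(urls[i]); });
}
callbacks.forEach(function(cb) { cb(); });
console.log(logged); // [ undefined, undefined, undefined ] -- i is 3 by now

// The fix: a function parameter gives each callback its own copy.
var fixed = [];
var fixedCallbacks = urls.map(function(url) {
    return function() { fixed.push(url); };
});
fixedCallbacks.forEach(function(cb) { cb(); });
console.log(fixed); // [ 'a', 'b', 'c' ]
```

The same capture applies to `response.on('end', ...)`: if the URL is passed in as a parameter, the 'end' handler still sees the right one.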

Upvotes: 44

Views: 108617

Answers (6)

KRISH C

Reputation: 31

Promise.allSettled will not stop at the first error. It makes sure you process all responses, even if some of them fail.

Promise.allSettled(promises)
  .then((data) => {
    // do your stuff here
  })
  .catch((err) => {
    console.log(JSON.stringify(err, null, 4));
  });
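To make the per-result handling concrete, here is a small self-contained sketch (Node 12.9+, where allSettled is available); the two local promises stand in for successful and failed requests. Each result is an object with a `status` of 'fulfilled' or 'rejected', carrying either a `value` or a `reason`:

```javascript
// Stand-ins for one request that succeeds and one that fails.
const promises = [
  Promise.resolve('feed one'),
  Promise.reject(new Error('timeout')),
];

Promise.allSettled(promises).then((results) => {
  // Split the settled results into successes and failures.
  const bodies = results
    .filter((r) => r.status === 'fulfilled')
    .map((r) => r.value);
  const errors = results
    .filter((r) => r.status === 'rejected')
    .map((r) => r.reason.message);
  console.log(bodies); // [ 'feed one' ]
  console.log(errors); // [ 'timeout' ]
});
```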

Upvotes: 1

ANUPAM CHAUDHARY

Reputation: 151

The problem can be easily solved using a closure. Make a function to handle the request and call that function in the loop. Every time the function is called, it has its own lexical scope, and thanks to the closure it retains the URL even after the loop ends. Even if the response arrives in streams, the closure handles that too.

const request = require("request");

function getTheUrl(data) {
    var options = {
        url: "https://jsonplaceholder.typicode.com/posts/" + data
    }
    return options
}

function consoleTheResult(url) {
    request(url, function (err, res, body) {
        if (err) return console.log(err);
        console.log(body);
    });
}

for (var i = 0; i < 10; i++) {
    consoleTheResult(getTheUrl(i))
}

Upvotes: 0

werne2j

Reputation: 583

I know this is an old question, but I think a better solution would be to use JavaScript's Promise.all():

const request = require('request-promise');
const urls = ["http://www.google.com", "http://www.example.com"];
const promises = urls.map(url => request(url));
Promise.all(promises).then((data) => {
    // data = [body of urls[0], body of urls[1]]
});
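As a side note, request-promise has since been deprecated; on Node 18+ the same pattern works with the built-in fetch and no dependencies. A sketch (the URLs are just examples):

```javascript
const urls = ["http://www.google.com", "http://www.example.com"];

// Start all requests at once; Promise.all preserves input order,
// so bodies[0] belongs to urls[0] regardless of which finished first.
Promise.all(urls.map((url) => fetch(url).then((res) => res.text())))
  .then((bodies) => {
    console.log(bodies.length);
  })
  .catch((err) => {
    // Promise.all rejects as soon as any single request fails.
    console.log('Error:', err.message);
  });
```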

Upvotes: 55

sait cihangir Aldemir

Reputation: 11

You can use any promise library with an ".all" implementation. I use the RSVP library; it's simple enough.

var fs = require('fs');
var request = require('request');
var RSVP = require('rsvp');

var downloadFileList = [{url: 'http://stuff', dataname: 'filename to download'}];
var ddownload = downloadFileList.map(function(id) {
    var dataname = id.dataname;
    var url = id.url;
    return new RSVP.Promise(function(fulfill, reject) {
        var stream = fs.createWriteStream(dataname);
        stream.on('close', function() {
            console.log(dataname + ' downloaded');
            fulfill();
        });
        request(url).on('error', function(err) {
            console.log(err);
            reject();
        }).pipe(stream);
    });
});
return RSVP.allSettled(ddownload);

Upvotes: 1

toasted_flakes

Reputation: 2509

By default, Node HTTP requests are asynchronous. You can start them sequentially in your code and call a function once all requests are done. You can either do it by hand (count finished vs. started requests) or use async.js.

This is the no-dependency way (error checking omitted):

var http = require('http');
var urls = ["http://www.google.com", "http://www.example.com"];
var responses = [];
var completed_requests = 0;

urls.forEach(function(url) {
    http.get(url, function(res) {
        responses.push(res);
        completed_requests++;
        if (completed_requests === urls.length) {
            // All downloads done, process the responses array
            console.log(responses);
        }
    });
});

Upvotes: 47

Adrian

Reputation: 9590

You need to check that the 'end' (data complete) event has fired for the exact number of requests. Here's a working example:

var http = require('http');
var urls = ['http://adrianmejia.com/atom.xml', 'http://twitrss.me/twitter_user_to_rss/?user=amejiarosario'];
var completed_requests = 0;

urls.forEach(function(url) {
  var responses = [];
  http.get(url, function(res) {
    res.on('data', function(chunk) {
      responses.push(chunk);
    });

    res.on('end', function() {
      if (completed_requests++ == urls.length - 1) {
        // All downloads are completed
        console.log('body:', responses.join());
      }
    });
  });
});

Upvotes: 28
