aRtoo

Reputation: 1892

How to get data after loop using promise

I am working on an async problem. I'm making a web scraper: after scraping the page, I need to save the data to my MongoDB database and then send it to the frontend. But since I process the elements in a loop, I can't put res.json() inside it, because that throws an error (you can only send a response once).

I'm stuck here. I've used promises before, but this is confusing.

router.get('/scrape', (req, res) => {
  request('http://www.nytimes.com', function test(error, response, html) {
    const $ = cheerio.load(html);

    // An empty array to save the data that we'll scrape
    const results = [];

    $('h2.story-heading, p.summary').each(function(i, element) {
      const link = $(element)
        .children()
        .attr('href');
      const title = $(element)
        .children()
        .text();
      const summary = $(element)
        .children()
        .text();

      const data = {
        title: title,
        link: link,
        summary: summary,
      };

      articles
        .create(data)
        .then((resp) => results.push(resp))
        // .then((resp) => Promise.resolve(results)) //
        // .then((jsonDta ) => res.json(jsonData)) // error you can only give response once.
        .catch((err) => reject(err));
    });
    console.log(results); // empty array
    res.json(results)// empty 
  });
});

My plan is:

I need to keep the create query inside the loop, because each record needs to get its own id.

Upvotes: 1

Views: 826

Answers (3)

McRist

Reputation: 1748

Use the .map function to collect all the promises, pass them to Promise.all, and then return the results.

    request('http://www.nytimes.com', function test(error, response, html) {
      const $ = cheerio.load(html);

      const summary = $('h2.story-heading, p.summary');
      Promise.all(summary.map((i, element) => {
        const data = {
          title: $(element).children().text(),
          link: $(element).children().attr('href'),
          summary: $(element).children().text(),
        };

        return articles.create(data);
      }).get())
      .then((result) => {
        console.log(result);
        res.json(result);
      });
    });

Upvotes: 1

Bergur

Reputation: 4057

Something like this might work (code not tested)

router.get('/scrape', (req, res) => {
  request('http://www.nytimes.com', function test(error, response, html) {
    const $ = cheerio.load(html);

    // An empty array to save the data that we'll scrape
    const results = [];

    $('h2.story-heading, p.summary').each(function(i, element) {
      const link = $(element)
        .children()
        .attr('href');
      const title = $(element)
        .children()
        .text();
      const summary = $(element)
        .children()
        .text();

      const data = {
        title: title,
        link: link,
        summary: summary,
      };

      const articleCreate = articles.create(data); 
      results.push(articleCreate);

    });

    console.log(results); // this is an array of promises.

    Promise.all(results).then(allResults => {
      res.json(allResults)
    });

    // or you could use Array.prototype.reduce for sequential resolution instead of Promise.all
  });
});
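The sequential alternative mentioned in the comment above can be sketched with Array.prototype.reduce. This is a minimal, self-contained example; fakeCreate is a stand-in I've invented for articles.create, which is assumed to return a promise:

```javascript
// Resolve each create() one at a time by chaining promises with reduce,
// instead of running them concurrently via Promise.all.
// fakeCreate is a hypothetical stand-in for articles.create.
const fakeCreate = (data) => Promise.resolve({ id: data.title.length, ...data });

const items = [{ title: 'a' }, { title: 'bb' }, { title: 'ccc' }];

const sequential = items.reduce(
  (chain, item) =>
    chain.then((results) =>
      fakeCreate(item).then((created) => results.concat(created))
    ),
  Promise.resolve([]) // start the chain with an empty results array
);

sequential.then((allResults) => {
  console.log(allResults.length); // 3
});
```

Each fakeCreate call only starts once the previous one has resolved, which matters if the inserts must not run concurrently.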

Upvotes: 1

Roamer-1888

Reputation: 19288

Instead of trying to accumulate results directly, you can map the elements contained in $('h2.story-heading, p.summary') to an array of promises, then aggregate with Promise.all(). The results you want will be delivered by Promise.all(...).then(...).

router.get('/scrape', (req, res) => {
    request('http://www.nytimes.com', function test(error, response, html) {
        const $ = cheerio.load(html);
        const promises = $('h2.story-heading, p.summary')
        .get() // as in jQuery, .get() unwraps Cheerio and returns Array
        .map(function(element) { // this is Array.prototype.map()
            return articles.create({
                'title': $(element).children().text(),
                'link': $(element).children().attr('href'),
                'summary': $(element).children().text()
            })
            .catch(err => { // catch so any one failure doesn't scupper the whole scrape.
                return {}; // on failure of articles.create(), inject some kind of default object (or string or whatever).
            });
        });
        // At this point, you have an array of promises, which need to be aggregated with Promise.all().
        Promise.all(promises)
        .then(results => { // Promise.all() should accept whatever promises are returned by articles.create().
            console.log(results);
            res.json(results);
        });
    });
});

If you want any single failure to scupper the whole scrape, then omit the catch() and add catch() to the Promise.all().then() chain.
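That fail-fast variant might look like the following sketch, where fakeCreate is a hypothetical stand-in for articles.create that can reject:

```javascript
// No per-item catch: the first rejection makes Promise.all reject,
// and the single catch after .then() aborts the whole scrape.
// fakeCreate is a hypothetical stand-in for articles.create.
const fakeCreate = (data) =>
  data.ok ? Promise.resolve(data) : Promise.reject(new Error('create failed'));

const promises = [{ ok: true }, { ok: false }].map(fakeCreate);

Promise.all(promises)
  .then((results) => console.log('all saved:', results.length))
  .catch((err) => console.log('scrape aborted:', err.message));
```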

Notes:

  1. For .get() (and most other methods), the jQuery documentation is better than the Cheerio documentation (but be careful because Cheerio is a lean version of jQuery).

  2. At no point do you need new Promise(). All the promises you need are returned by articles.create().
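To illustrate note 2: wrapping a call that already returns a promise in new Promise() is redundant (often called the promise constructor antipattern). A minimal sketch, with fakeCreate as an invented stand-in for articles.create:

```javascript
// fakeCreate is a hypothetical stand-in for articles.create.
const fakeCreate = (data) => Promise.resolve(data);

// Redundant: new Promise() around something that is already a promise.
const wrapped = (data) =>
  new Promise((resolve, reject) => {
    fakeCreate(data).then(resolve).catch(reject);
  });

// Equivalent and simpler: just return the promise directly.
const direct = (data) => fakeCreate(data);

Promise.all([wrapped({ a: 1 }), direct({ a: 1 })]).then(([w, d]) => {
  console.log(w.a === d.a); // true
});
```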

Upvotes: 2
