Fra96

Reputation: 55

Promise error for file with 'lots' of lines

I'm reading a CSV file; every line contains a website URL. I have this function for reading the file:

function readCSV(csv){

  var lines = csv.split("\n");
  var result = [];
  var headers = lines[0].split(",");

  // for every line of the file I call the check_page function to check the policies (CSP and XFO)
  Promise.all(
    lines.map(line => {
      var obj = {};
      var currentline = line.split(",");
      console.log("currentline: " + currentline[1]);
      return check_page("https://www." + currentline[1]);
    })
  ).then(() => console.log('it worked')).catch(err => console.log(err));
}

For every line, this function calls check_page, which retrieves the CSP and XFO headers via an HTTP request.

async function check_page(web_page){

    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(web_page);

    console.log("MAIN: " + page.mainFrame().url());

    /* for every iframe I send an HTTP request to retrieve the policies from the response headers */
    var XMLHttpRequest = require("xmlhttprequest").XMLHttpRequest;
    var req = new XMLHttpRequest();
    console.log("DOING THE GET: " + page.mainFrame().url());
    req.open('GET', page.mainFrame().url(), false); // synchronous request
    req.send(null);
    var headers = req.getAllResponseHeaders().toLowerCase();
    var arr = headers.trim().split(/[\r\n]+/);

    // Create a map of header names to values
    var headerMap = {};
    arr.forEach(function (line) {
      var parts = line.split(': ');
      var header = parts.shift();
      var value = parts.join(': ');
      headerMap[header] = value;
    });

    await browser.close();
}

My code works with a small number of lines, but with a file of 100 lines I get this error:

(node:1076) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:1076) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGINT listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:1076) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGTERM listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:1076) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGHUP listeners added to [process]. Use emitter.setMaxListeners() to increase limit

I think I have to divide the work somehow, but I don't know how.
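(The warning appears because every puppeteer.launch() registers its own exit/SIGINT/SIGTERM/SIGHUP cleanup listeners on process, so launching 100 browsers at once exceeds the default limit of 10 listeners. For illustration only, "dividing the work" could mean processing the lines in fixed-size chunks, awaiting each chunk before launching the next set of browsers; a minimal sketch reusing check_page from above, with an arbitrary chunk size:)

async function checkInBatches(lines, batchSize = 5) {
    for (let i = 0; i < lines.length; i += batchSize) {
        // take the next slice of lines and run it concurrently
        const chunk = lines.slice(i, i + batchSize);
        await Promise.all(
            chunk.map(line => check_page("https://www." + line.split(",")[1]))
        );
        // only after this chunk finishes does the next one start,
        // so at most `batchSize` browsers are open at any time
    }
}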

Upvotes: 0

Views: 163

Answers (1)

Tomalak

Reputation: 338336

You have a bunch of URLs to check and you want to look at the response headers.

There is no need whatsoever to use puppeteer for this, let alone launch a full browser for each of your URLs. This is completely pointless and incredibly wasteful. Sending one HTTP request per URL is enough.

Using the request-promise module, it's quite a straightforward task.

const request = require('request-promise');

function readCSV(csv) {                               // -> 'a,b,c\na,b,c'
    var lines = csv.split("\n");                      // -> ['a,b,c', 'a,b,c']
    var table = lines.map(line => line.split(","));   // -> [['a', 'b', 'c'], ['a', 'b' ,'c']]
    var requests = table.map(row => request({         // -> [request, request]
        method: 'GET',
        uri: "https://www." + row[1],
        resolveWithFullResponse: true
    }));

    return Promise.all(requests).then(responses => {  // -> [response, response]
        console.log('it worked');
        responses.forEach(response => {
            var hrds = response.headers;
            // hrds is an object. print it, extract info from it, whatever
            // don't forget to look at the other properties of `response`, as well
        });
    }).catch(err => console.log(err));
}
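
For completeness, a sketch of how this might be wired up to a file on disk; the file name sites.csv is only a placeholder:

const fs = require('fs');

// read the whole file as text, then hand its contents to the function above
fs.readFile('sites.csv', 'utf8', (err, csv) => {
    if (err) throw err;
    readCSV(csv);
});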

Think of a better name than readCSV, because reading CSV is not what the function does.

Upvotes: 1
