AGkumar

Reputation: 43

Javascript - Call function in parallel and return combined result

This is my first question, and I'm trying to learn JavaScript/Node.js.

I have an array x.

var x = [1,2,3,4];

I also have a function which takes in a param, does some processing and returns a JSON object.

function funcName (param){
  //does some external API calls and returns a JSON
  return result;
}

Now, rather than iterating over the array and calling the function again and again, is there a way to call it for each element in parallel and then join the results and return them together?

I'm also looking for ways to catch the failed function executions, for example if funcName(3) fails for some reason.

Upvotes: 4

Views: 1962

Answers (2)

Get Off My Lawn

Reputation: 36299

What you could do is create a file that does your heavy lifting, then run a fork of that file.

In this function we do the following:

  • Loop over each value in the array and create a promise that we store in an array
  • For each value, create a fork and pass the value to it as an argument
  • Wait for the child to send its result back, then resolve the promise
  • Using Promise.all we can tell when all our child processes have completed
  • Promise.all resolves with an array of all the child process results

So our main process will look a little something like this:

const { fork } = require('child_process')

let x = [1, 2, 3, 4]

// Named processAll so we don't shadow the global `process` object
function processAll (values) {
  let promises = []
  for (let i = 0; i < values.length; i++) {
    promises.push(new Promise(resolve => {
      // Pass the value (not the loop index) to the child as an argument
      let cp = fork('my_process.js', [values[i]])
      cp.on('message', data => {
        cp.kill()
        resolve(data)
      })
    }))
  }
  Promise.all(promises).then(data => {
    console.log(data)
  })
}

processAll(x)

Now in our child we can read the argument we were given, do our heavy lifting, and send the result back, like so (very simple example):

// Arguments passed to fork() start at process.argv[2], and they arrive
// as strings, so convert before comparing against numbers
const param = Number(process.argv[2])

let result = []
switch (param) {
case 1:
  result = [1, 1, 1, 1, 1, 1]
  break
case 2:
  result = [2, 2, 2, 2, 2, 2]
  break
}

// Send the result back to the main process
process.send(result)

Upvotes: 2

Aluan Haddad

Reputation: 31803

The comments and other answer are correct. JavaScript has no parallel processing capability whatsoever (forking processes doesn't count).

However, you can make the API calls in a vaguely parallel fashion: since they are asynchronous, the network IO can be interleaved.

Consider the following:

const urls = ['api/items/1', 'api/items/2' /* etc. */];

// Wrap fetch in an arrow so it doesn't receive the array index
// as its second (init) argument
Promise.all(urls.map(url => fetch(url)))
  .then(results => {
    processResults(results);
  });

While that won't execute JavaScript instructions in parallel, the asynchronous fetch calls will not wait for each other to complete; they will be interleaved, and the results will be collected when all have completed.

With error handling:

const urls = ['api/items/1', 'api/items/2' /* etc. */];

Promise.all(urls.map(url => fetch(url)).map(promise => promise.catch(() => undefined)))
  .then(results => results.filter(result => result !== undefined))
  .then(results => {
    processResults(results);
  });
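Applied to the asker's x/funcName shape rather than URLs, the same idea can be written with Promise.allSettled (standard since ES2020), which records every outcome so one rejection doesn't discard the other results. A hedged sketch, assuming funcName is rewritten to return a Promise; the failing case and return values below are made up for illustration:

```javascript
const x = [1, 2, 3, 4]

// Hypothetical async stand-in for the asker's API-calling function;
// it rejects for 3 to demonstrate capturing a failed execution
function funcName (param) {
  if (param === 3) return Promise.reject(new Error('API call failed'))
  return Promise.resolve({ param, value: param * 10 })
}

Promise.allSettled(x.map(param => funcName(param))).then(outcomes => {
  // Each outcome is { status: 'fulfilled', value } or { status: 'rejected', reason }
  const results = outcomes
    .filter(o => o.status === 'fulfilled')
    .map(o => o.value)
  const failures = outcomes
    .filter(o => o.status === 'rejected')
    .map(o => o.reason.message)
  console.log(results)  // the results for 1, 2 and 4
  console.log(failures) // [ 'API call failed' ]
})
```

Unlike the catch-and-filter approach above, allSettled also keeps the rejection reasons, which answers the "catch the failed function executions" part of the question.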

Upvotes: 0
