Zeeshan Shamsuddeen

Reputation: 627

How does NodeJS handle parallel execution for same functions in a loop?

I am running the same asynchronous function in a loop, without awaiting it, on a system with 2 CPUs. When I logged the process id from inside the function, all of the calls had the same process id.

How does Node handle such parallel executions? Are both CPUs being used properly? Do I need to manually fork a process for each function call in the loop?


function next(){
  console.log('main started', process.pid);
  const arr = [];
  for (let i = 0; i < 10000000; i++)
    arr.push(1);
  arr.sort(function(a, b){ return a < b; });
  console.log('main ended');
}

function main(){
 next();
 next()
 next();
 next();
 next();
 next();
 console.log('-----------------------------');
}

main()


HTOP Screenshot

Upvotes: 0

Views: 3030

Answers (2)

jfriend00

Reputation: 707328

Node.js runs your actual Javascript in a single thread, so it does not apply more than one CPU to your Javascript unless you specifically design your code to put CPU-intensive tasks in Worker threads, or you create separate processes (with clustering, or with the child_process module) and farm work out to them. In a plain node.js program, a CPU-intensive operation (like your long sorting loop) will hog the one CPU it runs on and block the event loop from processing other requests. It will not involve other CPUs in that sorting operation and will not use other CPUs for your Javascript.

When you run an asynchronous operation, there is native code behind it, and that native code may or may not use additional threads or processes. For example, file I/O uses a thread pool, networking uses the OS's native asynchronous support (no threads), and spawn() or exec() in child_process start new processes.

If you show us the actual code for your specific situation, we can answer more specifically about how that particular operation works.

How does Node handle such parallel executions?

It depends upon what the operation is.

Are both CPUs being used properly?

Probably not, but we'd need to see your specific code.

Do I need to manually fork a process for each function call in the loop?

It depends upon the specific situation. To apply multiple CPUs to your actual Javascript (as opposed to asynchronous operations), you need multiple processes running your Javascript, or perhaps the newer Worker threads API. If the parallelism is all in asynchronous operations, then the event-driven nature of node.js is particularly good at managing many asynchronous operations at once, and you may not even benefit from involving multiple CPUs: most of the time node.js is just waiting for I/O to complete, and many, many requests can already be in flight at the same time very efficiently.

For actually getting multiple CPUs applied to running your Javascript itself, node.js has the cluster module, which is purpose-built for that: you can fire up one cluster worker process for each CPU core in your computer. Or you can use the newer Worker threads API.

Also see these answers that discuss how to address CPU intensive code in node.js:

How to process huge array of objects in nodejs

How to apply clustering/spawing child process techniques for Node.js application having bouth IO bound and CPU bound tasks?

node.js socket.io server long latency

Is it possible somehow do multithreading in NodeJS?

How cpu intensive is too much for node.js (worried about blocking event loop)

Upvotes: 2

Sunitha Premakumaran

Reputation: 122

You can use Promise.all, which resolves with the results of the asynchronous functions. This way node starts all the calls immediately instead of waiting for each array item to be processed in turn.

let results = await Promise.all(
  array.map(arrayItem => executeAsynchronousFunctionWith(arrayItem))
);

The variable results is an array of the resolved results, in the same order as the input array.
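A self-contained version of that pattern (executeAsynchronousFunctionWith here is a hypothetical stand-in for the real asynchronous function, simulated with a timer):

```javascript
// All three calls start immediately; Promise.all resolves once every
// promise has settled, and the results keep the input order.
function executeAsynchronousFunctionWith(arrayItem) {
  return new Promise((resolve) => setTimeout(() => resolve(arrayItem * 2), 10));
}

async function main() {
  const array = [1, 2, 3];
  const results = await Promise.all(
    array.map((arrayItem) => executeAsynchronousFunctionWith(arrayItem))
  );
  console.log(results); // [ 2, 4, 6 ]
}

main();
```

Note that this concurrency lives in the asynchronous operations themselves; as the other answer explains, it does not spread CPU-bound Javascript across multiple cores.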

Upvotes: 1
