Reputation: 91
I am using Node.js to do some computations (no Express app!).
My file "calculations.js" goes through a huge array of data and does a lot of calculations. In the end, I receive a simple result.
As the computations are heavy and only one of my 16 cores is used, it still takes a lot of time on my rig.
How can I utilize all 16 cores? All examples regarding the Node.js cluster module assume some kind of HTTP connection.
I tried the following without luck:
//cluster.js
var cluster = require('cluster');
var numCPUs = 16;

if (cluster.isMaster) {
    // fork one worker per core
    for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
    }
} else {
    // every worker runs the full script
    require('./calculations');
}
EDIT:
//calculations.js
const array = [{name: "target"}, {name: "walmart"}, {}, etc. ] // 150 objects
array.map((supermarket) => {
    calculationFunction(supermarket.name);
});
EDIT: It would help a lot if the array.map call could somehow be "clustered", meaning that the 150 calls to "calculationFunction" would be distributed over my 16 cores. That should be easily possible, no!?
It seems that my file "calculations.js" is now just started 16 times, each doing the exact same work, so the total time actually got much longer!?
Help is very much appreciated.
Kind regards
Upvotes: 0
Views: 478
Reputation: 3369
Well, this is one of the limitations of Node.js, and of most other languages too. Because Node is single-threaded, it only uses one of your machine's cores and simulates multi-threading through its asynchronous/non-blocking design.
Cluster won't help you here as used: it creates an instance of your script on each core, but without any coordination they all repeat the same work, so it's useless for you.
What you could do is divide your data into multiple standalone chunks and create a child process for each one, so that the work is spread across your machine's cores more effectively.
Or rewrite it in golang :), which would make it easy for you to utilize all of your machine's cores.
Upvotes: 1