Reputation: 430
I have a function that makes a REST call to a service and returns a promise; let's call that function execute(). The function takes an ID and sends it as a GET parameter to a REST endpoint, which persists the ID in a MongoDB database with some additional info.
On the client, I need to run execute() 100k times, for IDs 0 to 100k, and show the status of each (whether it succeeded or failed).
I did the obvious thing and created a loop from 0 to 100k, calling execute(i) each iteration. That eventually froze my Chrome as it ran out of memory (insufficient resources), and the flood of REST calls caused network congestion at the back end.
So I want to chop those 100k into manageable chunks of, say, 50 calls each. When all 50 are done (whether failed or succeeded), I want Promise.all([...]).then(...) to kick off the next 50, and so on until all 100k are done. That way I control the network congestion and the memory at the same time. However, I can't figure out how to structure this. Here is my code:
let promises = []
for (let i = 0; i < 100000; i++) {
  promises.push(execute(i))
  if (i % 50 === 0) {
    // fires immediately; the loop does not wait for these promises to settle
    Promise.all(promises)
      .then(a => updateStatus(a, true))
      .catch(a => updateStatus(a, false))
  }
}
The asynchronous nature of JavaScript means the loop keeps running without waiting for each Promise.all to settle. I really don't want to add a timer to hold the loop every 50 iterations, because that would block the UI and effectively make my app synchronous. Any suggestions on how to tackle this?
Thank You Very Much.
New to JavaScript.
Upvotes: 8
Views: 15118
Reputation: 23029
With promises alone, this can only be done with recursion.
If you can use a new version of Node.js (or a modern browser), use async/await; it will work the way you expect, and you can await Promise.all(promises) for each chunk.
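For example, a minimal sketch of that approach (chunk size 50; execute() and updateStatus() are the question's functions, their exact signatures assumed):

async function runAll() {
  for (let start = 0; start < 100000; start += 50) {
    const promises = [];
    for (let i = start; i < start + 50 && i < 100000; i++) {
      promises.push(execute(i));
    }
    try {
      // wait for the whole chunk before starting the next one
      const results = await Promise.all(promises);
      updateStatus(results, true);
    } catch (err) {
      updateStatus(err, false); // one rejection fails the whole chunk
    }
  }
}

runAll();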
If you can't, there is a nice library (called Async) that can execute 50 asynchronous calls at once with its parallelLimit method: https://caolan.github.io/async/v3/docs.html#parallelLimit
It is even better than chunks, because with chunks one slow callback blocks everything else in that chunk, while parallelLimit just keeps 50 callbacks executing at all times. (You can still pre-create chunks of 50 if you insist on them and use the .series method.)
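A minimal sketch with that method (the task wrapper and the { id, ok } result shape are illustrative, not part of the library; execute() and updateStatus() are the question's):

const async = require('async');

// one wrapped task per ID; failures are captured in the result so a
// single rejection does not abort the whole run
const tasks = [];
for (let i = 0; i < 100000; i++) {
  tasks.push(callback => {
    execute(i)
      .then(result => callback(null, { id: i, ok: true, result }))
      .catch(err => callback(null, { id: i, ok: false, err }));
  });
}

// never more than 50 tasks in flight at once
async.parallelLimit(tasks, 50, (err, results) => {
  results.forEach(r => updateStatus(r, r.ok));
});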
Upvotes: 4
Reputation: 3673
You can wrap each promise in a function and push it into an array. After splitting that array into chunks, process them with reduce.
Npm package: https://www.npmjs.com/package/concurrency-promise
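For illustration, a minimal sketch of that chunk-and-reduce pattern without the package (execute() is the question's function):

// wrap each call in a function so nothing starts until its chunk runs
const tasks = [];
for (let i = 0; i < 100000; i++) {
  tasks.push(() => execute(i));
}

// split into chunks of 50
const chunks = [];
for (let i = 0; i < tasks.length; i += 50) {
  chunks.push(tasks.slice(i, i + 50));
}

// reduce chains the chunks: each starts only after the previous settles
chunks
  .reduce(
    (prev, chunk) => prev.then(() => Promise.all(chunk.map(run => run()))),
    Promise.resolve()
  )
  .then(() => console.log('all chunks done'))
  .catch(err => console.error(err)); // a rejection in any chunk stops the chain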
Upvotes: 0
Reputation: 1
You can use async/await to perform the asynchronous tasks in sequential order: schedule a call to the same function if the original array still contains elements, else return the array of results.
// 2000 IDs for the demo; setTimeout stands in for the real REST call
let arr = Array.from({ length: 2000 }, (_, i) => i);
let requests = arr.slice(0);
let results = [];

let fn = async (chunks, results) => {
  let curr;
  try {
    // wait for the current chunk of 50 to settle
    curr = await Promise.all(chunks.map(prop =>
      new Promise(resolve => setTimeout(resolve, 500, prop))));
    results.push(curr);
    console.log(curr);
  } catch (err) {
    throw err;
  }
  // recurse with the next 50 until `requests` is drained
  return curr !== undefined && requests.length
    ? fn(requests.splice(0, 50), results)
    : results;
};

fn(requests.splice(0, 50), results)
  .then(data => console.log(data))
  .catch(err => console.error(err));
Upvotes: 4