Reputation: 129
I'd like to be able to start executing an array of Promises with Promise.all or Bluebird's Promise.map while that array is still being populated, and then await the results once the array is fully populated. This is useful when dealing with large datasets that take a long time to load or could run into memory constraints.
Example of how it works today, using a MongoDB cursor to load data (with Bluebird's Promise.map):
// This is a time-intensive operation
// and requires loading all values into
// memory before proceeding
const dataArray = await cursor.toArray();
// Start doing work after all objects are in memory
await Promise.map(dataArray, doSomeWork);
// done
Example of how I'd like it to work:
const dataArray = [];
const minItems = N;
let promiseMap = null;
// Populate the data array one item at a time
while (await cursor.hasNext()) {
  dataArray.push(await cursor.next());
  if (!promiseMap && dataArray.length > minItems) {
    // Start doing work once there is some data to
    // work with and keep filling the array
    promiseMap = Promise.map(dataArray, doSomeWork);
  }
}
// Once done filling the array, wait for all promises to resolve
await promiseMap;
// done
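For comparison, here is a rough sketch of the closest I can get with plain Promise.all, assuming the same async context, cursor, and doSomeWork as above: start each item's work as soon as the cursor yields it and collect the in-flight promises (the pending array is just illustrative), then await them all at the end. Unlike Promise.map, this gives no control over concurrency, which is why I'd prefer something like the version above:
// Rough sketch: kick off work per item as the cursor yields it
const pending = [];
while (await cursor.hasNext()) {
  // Start this item's work immediately and keep the in-flight promise
  pending.push(doSomeWork(await cursor.next()));
}
// Wait for all the started work to finish
await Promise.all(pending);
// done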
Is this possible, or is it a hard requirement to have a fully populated, static array before proceeding with execution?
Upvotes: 1
Views: 155