Chris

Reputation: 4983

What is the best way to limit concurrency when using ES6's Promise.all()?

I have some code that is iterating over a list that was queried out of a database and making an HTTP request for each element in that list. That list can sometimes be a reasonably large number (in the thousands), and I would like to make sure I am not hitting a web server with thousands of concurrent HTTP requests.

An abbreviated version of this code currently looks something like this...

function getCounts() {
  return users.map(user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
      .then(() => {
        /* snip */
        resolve();
      });
    });
  });
}

Promise.all(getCounts()).then(() => { /* snip */});

This code is running on Node 4.3.2. To reiterate, can Promise.all be managed so that only a certain number of Promises are in progress at any given time?

Upvotes: 236

Views: 177985

Answers (30)

vitaly-t

Reputation: 25840

The solution below uses the iter-ops library and its waitRace operator, which controls concurrency:

import {pipeAsync, map, waitRace} from 'iter-ops';

const i = pipeAsync(
    users, // inputs iterator/iterable
    map(u => u.remoteServer.getCount(u)), // create async request
    waitRace(10) // race-resolve up to 10 promises at a time
)
   .catch(err => {/* handle rejections */});

for await (const p of i) {
    //=> p = resolved value
}

Upvotes: 0

user3413723

Reputation: 12233

Unfortunately there is no way to do it with native Promise.all, so you have to be creative.

This is the quickest, most concise way I could find without using any outside libraries.

It makes use of a newer JavaScript feature called an iterator. The iterator basically keeps track of which items have been processed and which haven't.

In order to use it in code, you create an array of async functions. Each async function asks the same iterator for the next item that needs to be processed. Each function processes its own item asynchronously, and when done asks the iterator for a new one. Once the iterator runs out of items, all the functions complete.

Thanks to @Endless for inspiration.

const items = [
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2'
]

// get a cursor that keeps track of which items have already been processed.
const cursor = items.entries();

// create 5 loops that each run off the same cursor, which keeps track of location
const numWorkers = 5;
Array(numWorkers).fill().forEach(async () => {
    for (const [index, url] of cursor) {
        console.log('getting url is', index, url)
        // run your async task instead of this next line
        const text = await fetch(url).then(res => res.text())
        console.log('text is', text.slice(0, 20))
    }
})

Upvotes: 9

Dan Fabulich

Reputation: 39573

As all others in this answer thread have pointed out, Promise.all() won't do the right thing if you need to limit concurrency. But ideally you shouldn't even want to wait until all of the Promises are done before processing them.

Instead, you want to process each result as soon as it becomes available, so you don't have to wait for the very last promise to finish before you start iterating over them.

So, here's a code sample that does just that, based partly on the answer by Endless and also on this answer by T.J. Crowder.

EDIT: I've turned this little snippet into a library, concurrency-limit-runner.

// example tasks that sleep and return a number
// in real life, you'd probably fetch URLs or something
const tasks = [];
for (let i = 0; i < 20; i++) {
    tasks.push(async () => {
        console.log(`start ${i}`);
        await sleep(Math.random() * 1000);
        console.log(`end ${i}`);
        return i;
    });
}
function sleep(ms) { return new Promise(r => setTimeout(r, ms)); }

(async () => {
    for await (let value of runTasks(3, tasks.values())) {
        console.log(`output ${value}`);
    }
})();

async function* runTasks(maxConcurrency, taskIterator) {
    async function* createWorkerIterator() {
        // Each AsyncGenerator that this function* creates is a worker,
        // polling for tasks from the shared taskIterator. Sharing the
        // taskIterator ensures that each worker gets unique tasks.
        for (const task of taskIterator) yield await task();
    }

    const asyncIterators = new Array(maxConcurrency);
    for (let i = 0; i < maxConcurrency; i++) {
        asyncIterators[i] = createWorkerIterator();
    }
    yield* raceAsyncIterators(asyncIterators);
}

async function* raceAsyncIterators(asyncIterators) {
    async function nextResultWithItsIterator(iterator) {
        return { result: await iterator.next(), iterator: iterator };
    }
    /** @type Map<AsyncIterator<T>,
        Promise<{result: IteratorResult<T>, iterator: AsyncIterator<T>}>> */
    const promises = new Map();
    for (const iterator of asyncIterators) {
        promises.set(iterator, nextResultWithItsIterator(iterator));
    }
    while (promises.size) {
        const { result, iterator } = await Promise.race(promises.values());
        if (result.done) {
            promises.delete(iterator);
        } else {
            promises.set(iterator, nextResultWithItsIterator(iterator));
            yield result.value;
        }
    }
}

There's a lot of magic in here; let me explain.

This solution is built around async generator functions, which many JS developers may not be familiar with.

A generator function (aka function* function) returns a "generator," an iterator of results. Generator functions are allowed to use the yield keyword where you might have normally used a return keyword. The first time a caller calls next() on the generator (or uses a for...of loop), the function* function runs until it yields a value; that becomes the next() value of the iterator. But the subsequent time next() is called, the generator function resumes from the yield statement, right where it left off, even if it's in the middle of a loop. (You can also yield*, to yield all of the results of another generator function.)

An "async generator function" (async function*) is a generator function that returns an "async iterator," which is an iterator of promises. You can call for await...of on an async iterator. Async generator functions can use the await keyword, as you might do in any async function.

In the example, we call runTasks() with an array of task functions; we call .values() on the array to convert the array into an iterator.

runTasks() is an async generator function, so we can call it with a for await...of loop. Each time the loop runs, we'll process the result of the latest completed task.

runTasks() creates N async iterators, the "workers." Each worker polls for tasks from the shared taskIterator, ensuring that each worker gets a unique task.

The example calls runTasks with 3 concurrent workers, so no more than 3 tasks are launched at the same time. When any task completes, we immediately queue up the next task. (This is superior to "batching", where you do 3 tasks at once, await all three of them, and don't start the next batch of three until the entire previous batch has finished.)

runTasks() concludes by "racing" its async iterators with yield* raceAsyncIterators(). raceAsyncIterators() is like Promise.race() but it races N iterators of Promises instead of just N Promises; it returns an async iterator that yields the results of resolved Promises.

raceAsyncIterators() starts by defining a promises Map from each of the iterators to promises. Each promise is a promise for an iteration result along with the iterator that generated it.

With the promises map, we can Promise.race() the values of the map, giving us the winning iteration result and its iterator. If the iterator is completely done, we remove it from the map; otherwise we replace its Promise in the promises map with the iterator's next() Promise and yield result.value.

In conclusion, runTasks() is an async generator function that yields the results of racing N concurrent async iterators of tasks, so the end user can just for await (let value of runTasks(3, tasks.values())) to process each result as soon as it becomes available.

Upvotes: 23

iCPSoni

Reputation: 136

If you want to go for an external package, you can use p-limit:

import pLimit from 'p-limit';

const limit = pLimit(1);

const input = [
    limit(() => fetchSomething('foo')),
    limit(() => fetchSomething('bar')),
    limit(() => doSomething())
];

// Only one promise is run at once
const result = await Promise.all(input);
console.log(result);

Upvotes: 1

Endless

Reputation: 37815

If you know how iterators work and how they are consumed, you wouldn't need any extra library, since it can become very easy to build your own concurrency limiter yourself. Let me demonstrate:

/* [Symbol.iterator]() is equivalent to .values()
const iterator = [1,2,3][Symbol.iterator]() */
const iterator = [1,2,3].values()


// loop over all items with for..of
for (const x of iterator) {
  console.log('x:', x)
  
  // notice how this loop continues the same iterator
  // and consumes the rest of it, so the
  // outer loop does not log any more x's
  for (const y of iterator) {
    console.log('y:', y)
  }
}

We can use the same iterator and share it across workers.

If you had used .entries() instead of .values() you would have gotten an iterator that yields [index, value], which I will demonstrate below with a concurrency of 2

const sleep = t => new Promise(rs => setTimeout(rs, t))
const iterator = Array.from('abcdefghij').entries()
// const results = [] || Array(someLength)

async function doWork (iterator, i) {
  for (let [index, item] of iterator) {
    await sleep(1000)
    console.log(`Worker#${i}: ${index},${item}`)

    // in case you need to store the results in order
    // results[index] = item + item

    // or if the order does not matter
    // results.push(item + item)
  }
}

const workers = Array(2).fill(iterator).map(doWork)
//    ^--- starts two workers sharing the same iterator

Promise.allSettled(workers).then(console.log.bind(null, 'done'))

The benefit of this is that you can have a generator function instead of having everything ready at once.
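
For example, here is a sketch (reusing the sleep and doWork helpers above) where the shared iterator comes from a generator function, so items are produced lazily rather than being built up front:

// a generator that lazily produces [index, value] pairs, same shape as .entries()
function* lazyItems () {
  for (let i = 0; i < 10; i++) {
    yield [i, 'item-' + i]
  }
}

const lazyIterator = lazyItems()

// same pattern as above: two workers pulling from one shared iterator
const lazyWorkers = Array(2).fill(lazyIterator).map(doWork)
Promise.allSettled(lazyWorkers).then(console.log.bind(null, 'done'))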

What's even more awesome is that you can do stream.Readable.from(iterator) in Node (and eventually in WHATWG streams as well). And with a transferable ReadableStream, this has great potential in the future if you are working with web workers for performance.


Note: the difference between this and the async-pool example is that this spawns two workers, so if one worker throws an error at, say, index 5, it won't stop the other worker from doing the rest. You just go from a concurrency of 2 down to 1 (it won't stop there). So my advice is to catch all errors inside the doWork function.

Upvotes: 91

Anton Fil

Reputation: 305

No external libraries. Just plain JS.

It can be resolved using recursion.

The idea is that initially we immediately execute the maximum allowed number of queries and each of these queries should recursively initiate a new query on its completion.

In this example I collect successful responses together with errors, and I execute all queries, but it's possible to slightly modify the algorithm if you want to terminate batch execution on the first failure.

async function batchQuery(queries, limit) {
  limit = Math.min(queries.length, limit);

  return new Promise((resolve, reject) => {
    const responsesOrErrors = new Array(queries.length);
    let startedCount = 0;
    let finishedCount = 0;
    let hasErrors = false;

    function recursiveQuery() {
      let index = startedCount++;

      doQuery(queries[index])
        .then(res => {
          responsesOrErrors[index] = res;
        })
        .catch(error => {
          responsesOrErrors[index] = error;
          hasErrors = true;
        })
        .finally(() => {
          finishedCount++;
          if (finishedCount === queries.length) {
            hasErrors ? reject(responsesOrErrors) : resolve(responsesOrErrors);
          } else if (startedCount < queries.length) {
            recursiveQuery();
          }
        });
    }

    for (let i = 0; i < limit; i++) {
      recursiveQuery();
    }
  });
}

async function doQuery(query) {
  console.log(`${query} started`);
  const delay = Math.floor(Math.random() * 1500);
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (delay <= 1000) {
        console.log(`${query} finished successfully`);
        resolve(`${query} success`);
      } else {
        console.log(`${query} finished with error`);
        reject(`${query} error`);
      }
    }, delay);
  });
}

const queries = new Array(10).fill('query').map((query, index) => `${query}_${index + 1}`);

batchQuery(queries, 3)
  .then(responses => console.log('All successful', responses))
  .catch(responsesWithErrors => console.log('All with several failed', responsesWithErrors));

Upvotes: 1

Sean

Reputation: 2529

I know there are a lot of answers already, but I ended up using a very simple solution, with no library or sleep required, that uses only a few commands. Promise.all() simply lets you know when all the promises passed to it are finalized. So, you can check on the queue intermittently to see if it is ready for more work, and if so, add more processes.

For example:

// init vars
const batchSize = 5
const calls = []
// loop through data and run processes  
for (let [index, data] of [1,2,3].entries()) {
   // pile on async processes 
   calls.push(doSomethingAsyncWithData(data))
   // every 5th concurrent call, wait for them to finish before adding more
   if ((index + 1) % batchSize === 0) await Promise.all(calls)
}
// clean up for any data to process left over if smaller than batch size
const allFinishedProcs = await Promise.all(calls)

Upvotes: 1

leonbloy

Reputation: 75926

Here's my recipe, based on killdash9's answer. It allows you to choose the behaviour on exceptions (Promise.all vs. Promise.allSettled).

// Given an array of async functions, runs them in parallel,
// with at most maxConcurrency simultaneous executions
// Except for that, behaves the same as Promise.all,
// unless allSettled is true, in which case it behaves like Promise.allSettled

function concurrentRun(maxConcurrency = 10, funcs = [], allSettled = false) {
  if (funcs.length <= maxConcurrency) {
    const ps = funcs.map(f => f());
    return allSettled ? Promise.allSettled(ps) : Promise.all(ps);
  }
  return new Promise((resolve, reject) => {
    let idx = -1;
    const ps = new Array(funcs.length);
    function nextPromise() {
      idx += 1;
      if (idx < funcs.length) {
        (ps[idx] = funcs[idx]()).then(nextPromise).catch(allSettled ? nextPromise : reject);
      } else if (idx === funcs.length) {
        (allSettled ? Promise.allSettled(ps) : Promise.all(ps)).then(resolve).catch(reject);
      }
    }
    for (let i = 0; i < maxConcurrency; i += 1) nextPromise();
  });
}

Upvotes: 0

killdash9

Reputation: 2462

The concurrent function below will return a Promise which resolves to an array of resolved promise values, while implementing a concurrency limit. No 3rd party library.

// waits 50 ms then resolves to the passed-in arg
const sleepAndResolve = s => new Promise(rs => setTimeout(()=>rs(s), 50))

// queue 100 promises
const funcs = []
for(let i=0; i<100; i++) funcs.push(()=>sleepAndResolve(i))

//run the promises with a max concurrency of 10
concurrent(10,funcs) 
.then(console.log) // prints [0,1,2...,99]
.catch(()=>console.log("there was an error"))

/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent(concurrency, funcs) {
    return new Promise((resolve, reject) => {
        let index = -1;
        const p = [];
        for (let i = 0; i < Math.max(1, Math.min(concurrency, funcs.length)); i++)
            runPromise();
        function runPromise() {
            if (++index < funcs.length)
                (p[p.length] = funcs[index]()).then(runPromise).catch(reject);
            else if (index === funcs.length)
                Promise.all(p).then(resolve).catch(reject);
        }
    });
}

Here's the TypeScript version if you are interested:

/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent<V>(concurrency:number, funcs:(()=>Promise<V>)[]):Promise<V[]> {
  return new Promise((resolve,reject)=>{
    let index = -1;
    const p:Promise<V>[] = []
    for(let i=0; i<Math.max(1,Math.min(concurrency, funcs.length)); i++) runPromise()
    function runPromise() {
      if (++index < funcs.length) (p[p.length] = funcs[index]()).then(runPromise).catch(reject)
      else if (index === funcs.length) Promise.all(p).then(resolve).catch(reject)
    }
  })
}

Upvotes: 2

AlexRMU

Reputation: 67

I suggest not downloading packages and not writing hundreds of lines of code:

async function async_arr<T1, T2>(
    arr: T1[],
    func: (x: T1) => Promise<T2> | T2, //can be sync or async
    limit = 5
) {
    let results: T2[] = [];
    let workers = [];
    let current = Math.min(arr.length, limit);
    async function process(i) {
        if (i < arr.length) {
            results[i] = await Promise.resolve(func(arr[i]));
            await process(current++);
        }
    }
    for (let i = 0; i < current; i++) {
        workers.push(process(i));
    }
    await Promise.all(workers);
    return results;
}

Upvotes: 0

Vytenis Urbonavičius

Reputation: 11

It is possible to limit requests to a server by using https://www.npmjs.com/package/job-pipe

Basically you create a pipe and tell it how many concurrent requests you want:

const pipe = createPipe({ throughput: 6, maxQueueSize: Infinity })

Then you take the function which performs the call and force it through the pipe to create a limited number of concurrent calls:

const makeCall = async () => {...}
const limitedMakeCall = pipe(makeCall)

Finally, you call this method as many times as you need as if it were unchanged, and it will limit itself to the number of parallel executions it can handle:

await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
....
await limitedMakeCall()

Profit.

Upvotes: 0

TimoStaudinger

Reputation: 42450

Note that Promise.all() doesn't trigger the promises to start their work, creating the promise itself does.

With that in mind, one solution would be to check whenever a promise is resolved whether a new promise should be started or whether you're already at the limit.
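
As an illustration of that idea, here is a minimal hand-rolled sketch (assuming the users array and remoteServer.getCount() from the question):

function getCountsLimited(users, limit) {
  return new Promise((resolve, reject) => {
    const results = [];
    let next = 0;      // index of the next user to start
    let running = 0;   // number of requests currently in flight

    const startNext = () => {
      // finished once everything has started and nothing is still in flight
      if (next >= users.length && running === 0) return resolve(results);
      // top up to the limit
      while (running < limit && next < users.length) {
        const index = next++;
        running++;
        remoteServer.getCount(users[index]) // makes an HTTP request
          .then(count => {
            results[index] = count;
            running--;
            startNext(); // a slot is free, check whether more work remains
          })
          .catch(reject);
      }
    };

    startNext();
  });
}

// usage: getCountsLimited(users, 10).then(counts => { /* snip */ });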

However, there is really no need to reinvent the wheel here. One library that you could use for this purpose is es6-promise-pool. From their examples:

var PromisePool = require('es6-promise-pool')
 
var promiseProducer = function () {
  // Your code goes here. 
  // If there is work left to be done, return the next work item as a promise. 
  // Otherwise, return null to indicate that all promises have been created. 
  // Scroll down for an example. 
}
 
// The number of promises to process simultaneously. 
var concurrency = 3
 
// Create a pool. 
var pool = new PromisePool(promiseProducer, concurrency)
 
// Start the pool. 
var poolPromise = pool.start()
 
// Wait for the pool to settle. 
poolPromise.then(function () {
  console.log('All promises fulfilled')
}, function (error) {
  console.log('Some promise rejected: ' + error.message)
})
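
As a sketch, a producer for the question's use case might look like this (assuming the users array and remoteServer.getCount() from the question; the pool winds down once the producer returns null):

var userIndex = 0

var promiseProducer = function () {
  // No more work left: return null so the pool winds down.
  if (userIndex >= users.length) {
    return null
  }
  // Otherwise return the next work item as a promise.
  var user = users[userIndex++]
  return remoteServer.getCount(user)
}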

Upvotes: 79

Safareli

Reputation: 880

A semaphore is a well-known concurrency primitive that was designed to solve similar problems. It's a very universal construct; Semaphore implementations exist in many languages. This is how one would use a Semaphore to solve this issue:

async function main() {
  const s = new Semaphore(100);
  const res = await Promise.all(
    users.map((user) =>
      s.runExclusive(() => remoteServer.getCount(user))
    )
  );
  return res;
}

I'm using the Semaphore implementation from async-mutex; it has decent documentation and TypeScript support.

If you want to dig deep into topics like this, you can take a look at the book "The Little Book of Semaphores", which is freely available as a PDF here.

Upvotes: 3

Rafael Xavier

Reputation: 2889

Using the tiny-async-pool ES9 for await...of API, you can do the following:

const asyncPool = require("tiny-async-pool");
const getCount = async (user) => ([user, await remoteServer.getCount(user)]);
const concurrency = 2;

for await (const [user, count] of asyncPool(concurrency, users, getCount)) {
  console.log(user, count);
}

The above asyncPool function returns an async iterator that yields as soon as a promise completes (under concurrency limit) and it rejects immediately as soon as one of the promises rejects.

Upvotes: 1

Andrew Odri

Reputation: 9432

This solution uses an async generator to manage concurrent promises with vanilla JavaScript. The throttle generator takes 3 arguments:

  • An array of values to be supplied as arguments to a promise-generating function (e.g. an array of URLs).
  • A function that returns a promise (e.g. a promise for an HTTP request).
  • An integer that represents the maximum number of concurrent promises allowed.

Promises are only instantiated as required in order to reduce memory consumption. Results can be iterated over using a for await...of statement.

The example below provides a function to check promise state, the throttle async generator, and a simple function that returns a promise based on setTimeout. The async IIFE at the end defines the reservoir of timeout values, sets up the async iterable returned by throttle, then iterates over the results as they resolve.

If you would like a more complete example for HTTP requests, let me know in the comments.

Please note that Node.js 16+ is required in order to run this example.

const promiseState = function( promise ) {
  const control = Symbol();

  return Promise
    .race([ promise, control ])
    .then( value => ( value === control ) ? 'pending' : 'fulfilled' )
    .catch( () => 'rejected' );
}

const throttle = async function* ( reservoir, promiseClass, highWaterMark ) {
  let iterable = reservoir.splice( 0, highWaterMark ).map( item => promiseClass( item ) );

  while ( iterable.length > 0 ) {
    await Promise.any( iterable );

    const pending = [];
    const resolved = [];

    for ( const currentValue of iterable ) {
      if ( await promiseState( currentValue ) === 'pending' ) {
        pending.push( currentValue );
      } else {
        resolved.push( currentValue );
      }
    }

    console.log({ pending, resolved, reservoir });

    iterable = [
      ...pending,
      ...reservoir.splice( 0, highWaterMark - pending.length ).map( value => promiseClass( value ) )
    ];

    yield Promise.allSettled( resolved );
  }
}

const getTimeout = delay => new Promise( ( resolve, reject ) => {
  setTimeout(resolve, delay, delay);
} );

( async () => {
  const test = [ 1100, 1200, 1300, 10000, 11000, 9000, 5000, 6000, 3000, 4000, 1000, 2000, 3500 ];

  const throttledRequests = throttle( test, getTimeout, 4 );

  for await ( const timeout of throttledRequests ) {
    console.log( timeout );
  }
} )();

Upvotes: 1

hurricane

Reputation: 6724

I have a solution that creates chunks and uses the .reduce function to wait for each chunk's Promise.all() to finish. I also add some delay in case the promises have rate limits.

export function delay(ms: number) {
  return new Promise<void>((resolve) => setTimeout(resolve, ms));
}

export const chunk = <T>(arr: T[], size: number): T[][] => [
  ...Array(Math.ceil(arr.length / size)),
].map((_, i) => arr.slice(size * i, size + size * i));

const myIdList = []; // all items
const groupedIdList = chunk(myIdList, 20); // grouped by 20 items

await groupedIdList.reduce(async (prev, subIdList) => {
  await prev;
  // Make sure we wait for 500 ms after processing every page to prevent overloading the calls.
  const data = await Promise.all(subIdList.map(myPromise));
  await delay(500);
}, Promise.resolve());

Upvotes: 0

hiddensunset4

Reputation: 6029

Using Array.prototype.splice

while (funcs.length) {
  // 100 at a time
  await Promise.all( funcs.splice(0, 100).map(f => f()) )
}
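
For the question's scenario, funcs would be an array of promise-returning functions; a sketch (assuming users and remoteServer.getCount() from the question):

// each entry is a function, so a request only starts when its batch runs
const funcs = users.map(user => () => remoteServer.getCount(user))

Note that this is batching rather than a rolling limit: each group of 100 requests must finish before the next group starts.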

Upvotes: 135

Simo

Reputation: 474

If your goal is to slow down Promise.all to avoid rate limiting or overloading:

Here's my implementation

async function promiseAllGentle(arr, batchSize = 5, sleep = 50) {
  let output = [];
  while (arr.length) {
    const batchResult = await Promise.all(arr.splice(0, batchSize));
    output = [...output, ...batchResult];
    await new Promise((res) => setTimeout(res, sleep));
  }
  return output;
}

Upvotes: -2

cphoover

Reputation: 378

Warning: this has not been benchmarked for efficiency and does a lot of array copying/creation.

If you want a more functional approach you could do something like:

import chunk from 'lodash.chunk';

const maxConcurrency = (max) => (dataArr, promiseFn) =>
  chunk(dataArr, max).reduce(
      async (agg, batch) => [
          ...(await agg),
          ...(await Promise.all(batch.map(promiseFn)))
      ],
      []
  );

and then you could use it like:

const randomFn = (data) =>
    new Promise((res) => setTimeout(
      () => res(data + 1),
        Math.random() * 1000
      ));


const result = await maxConcurrency(5)(
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    randomFn
);
console.log('result+++', result);

Upvotes: 0

cmhteixeira

Reputation: 967

  • @tcooc's answer was quite cool. Didn't know about it and will leverage it in the future.
  • I also enjoyed @MatthewRideout's answer, but it uses an external library!!

Whenever possible, I give a shot at developing this kind of thing on my own, rather than going for a library. You end up learning a lot of concepts which seemed daunting before.

class Pool {
    constructor(maxAsync) {
        this.maxAsync = maxAsync;
        this.asyncOperationsQueue = [];
        this.currentAsyncOperations = 0
    }

    runAnother() {
        if (this.asyncOperationsQueue.length > 0 && this.currentAsyncOperations < this.maxAsync) {
            this.currentAsyncOperations += 1;
            this.asyncOperationsQueue.pop()()
                .then(() => { this.currentAsyncOperations -= 1; this.runAnother() }, () => { this.currentAsyncOperations -= 1; this.runAnother() })
        }
    }

    add(f) {  // the argument f is a function of signature () => Promise
        return new Promise((resolve, reject) => {
            this.asyncOperationsQueue.push(
                () => f().then(resolve).catch(reject)
            )
            // check for a free slot after queueing, so a task added to an idle pool starts immediately
            this.runAnother();
        })
    }
}

//#######################################################
//                        TESTS
//#######################################################

function dbCall(id, timeout, fail) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            if (fail) {
               reject(`Error for id ${id}`);
            } else {
                resolve(id);
            }
        }, timeout)
    }
    )
}


const dbQuery1 = () => dbCall(1, 5000, false);
const dbQuery2 = () => dbCall(2, 5000, false);
const dbQuery3 = () => dbCall(3, 5000, false);
const dbQuery4 = () => dbCall(4, 5000, true);
const dbQuery5 = () => dbCall(5, 5000, false);


const cappedPool = new Pool(2);

const dbQuery1Res = cappedPool.add(dbQuery1).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery2Res = cappedPool.add(dbQuery2).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery3Res = cappedPool.add(dbQuery3).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery4Res = cappedPool.add(dbQuery4).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery5Res = cappedPool.add(dbQuery5).catch(i => i).then(i => console.log(`Resolved: ${i}`))

This approach provides a nice API, similar to thread pools in Scala/Java.
After creating one instance of the pool with const cappedPool = new Pool(2), you provide promises to it simply with cappedPool.add(() => myPromise).
Obviously, we must ensure that the promise does not start immediately, and that is why we must "provide it lazily" with the help of the function.

Most importantly, notice that the result of the method add is a Promise which will be completed/resolved with the value of your original promise! This makes for a very intuitive use.

const resultPromise = cappedPool.add( () => dbCall(...))
resultPromise
.then( actualResult => {
   // Do something with the result from the DB
  }
)

Upvotes: 1

Dmitriy Mozgovoy

Reputation: 1597

One more solution with a custom promise library (CPromise):

    import { CPromise } from "c-promise2";
    import cpFetch from "cp-fetch";
    
    const promise = CPromise.all(
      function* () {
        const urls = [
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
        ];
    
        for (const url of urls) {
          yield cpFetch(url); // add a promise to the pool
          console.log(`Request [${url}] completed`);
        }
      },
      { concurrency: 2 }
    ).then(
      (v) => console.log(`Done: `, v),
      (e) => console.warn(`Failed: ${e}`)
    );
    
    // yeah, we able to cancel the task and abort pending network requests
    // setTimeout(() => promise.cancel(), 4500);

    import { CPromise } from "c-promise2";
    import cpFetch from "cp-fetch";
    
    const promise = CPromise.all(
      [
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
      ],
      {
        mapper: (url) => {
          console.log(`Request [${url}]`);
          return cpFetch(url);
        },
        concurrency: 2
      }
    ).then(
      (v) => console.log(`Done: `, v),
      (e) => console.warn(`Failed: ${e}`)
    );
    
    // yeah, we able to cancel the task and abort pending network requests
    //setTimeout(() => promise.cancel(), 4500);

Upvotes: 0

Eugene Blinn

Reputation: 323

Expanding on the answer posted by @deceleratedcaviar, I created a 'batch' utility function that takes as arguments: an array of values, a concurrency limit and a processing function. Yes, I realize that using Promise.all this way is more akin to batch processing than true concurrency, but if the goal is to limit the number of HTTP calls at one time, I go with this approach due to its simplicity and no need for an external library.

async function batch(o) {
  let arr = o.arr
  let resp = []
  while (arr.length) {
    let subset = arr.splice(0, o.limit)
    let results = await Promise.all(subset.map(o.process))
    resp.push(results)
  }
  return [].concat.apply([], resp)
}

let arr = []
for (let i = 0; i < 250; i++) { arr.push(i) }

async function calc(val) { return val * 100 }

(async () => {
  let resp = await batch({
    arr: arr,
    limit: 100,
    process: calc
  })
  console.log(resp)
})();

Upvotes: 0

Kristian Oye

Reputation: 1202

So many good solutions. I started out with the elegant solution posted by @Endless and ended up with this little extension method that does not use any external libraries nor does it run in batches (although it assumes you have features like async/await, etc.):

Promise.allWithLimit = async (taskList, limit = 5) => {
    const iterator = taskList.entries();
    let results = new Array(taskList.length);
    let workerThreads = new Array(limit).fill(0).map(() => 
        new Promise(async (resolve, reject) => {
            try {
                let entry = iterator.next();
                while (!entry.done) {
                    let [index, promise] = entry.value;
                    try {
                        results[index] = await promise;
                        entry = iterator.next();
                    }
                    catch (err) {
                        results[index] = err;
                    }
                }
                // No more work to do
                resolve(true); 
            }
            catch (err) {
                // This worker is dead
                reject(err);
            }
        }));

    await Promise.all(workerThreads);
    return results;
};

const demoTasks = new Array(10).fill(0).map((v,i) => new Promise(resolve => {
   let n = (i + 1) * 5;
   setTimeout(() => {
      console.log(`Did nothing for ${n} seconds`);
      resolve(n);
   }, n * 1000);
}));

var results = Promise.allWithLimit(demoTasks);

Upvotes: 1

Adelost

Reputation: 2453

Here is my ES7 solution to a copy-paste-friendly and feature-complete Promise.all()/map() alternative, with a concurrency limit.

Similar to Promise.all(), it maintains return order as well as a fallback for non-promise return values.

I also included a comparison of the different implementations, as it illustrates some aspects a few of the other solutions have missed.

Usage

const asyncFn = delay => new Promise(resolve => setTimeout(() => resolve(), delay));
const args = [30, 20, 15, 10];
await asyncPool(args, arg => asyncFn(arg), 4); // concurrency limit of 4

Implementation

async function asyncBatch(args, fn, limit = 8) {
  // Copy arguments to avoid side effects
  args = [...args];
  const outs = [];
  while (args.length) {
    const batch = args.splice(0, limit);
    const out = await Promise.all(batch.map(fn));
    outs.push(...out);
  }
  return outs;
}

async function asyncPool(args, fn, limit = 8) {
  return new Promise((resolve) => {
    // Copy arguments to avoid side effect, reverse queue as
    // pop is faster than shift
    const argQueue = [...args].reverse();
    let count = 0;
    const outs = [];
    const pollNext = () => {
      if (argQueue.length === 0 && count === 0) {
        resolve(outs);
      } else {
        while (count < limit && argQueue.length) {
          const index = args.length - argQueue.length;
          const arg = argQueue.pop();
          count += 1;
          const out = fn(arg);
          const processOut = (out, index) => {
            outs[index] = out;
            count -= 1;
            pollNext();
          };
          if (typeof out === 'object' && out.then) {
            out.then(out => processOut(out, index));
          } else {
            processOut(out, index);
          }
        }
      }
    };
    pollNext();
  });
}

Comparison

// A simple async function that returns after the given delay
// and prints its value to allow us to determine the response order
const asyncFn = delay => new Promise(resolve => setTimeout(() => {
  console.log(delay);
  resolve(delay);
}, delay));

// List of arguments to the asyncFn function
const args = [30, 20, 15, 10];

// As a comparison of the different implementations, a low concurrency
// limit of 2 is used in order to highlight the performance differences.
// If a limit greater than or equal to args.length is used the results
// would be identical.

// Vanilla Promise.all/map combo
const out1 = await Promise.all(args.map(arg => asyncFn(arg)));
// prints: 10, 15, 20, 30
// total time: 30ms

// Pooled implementation
const out2 = await asyncPool(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 15, 10
// total time: 40ms

// Batched implementation
const out3 = await asyncBatch(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 10, 15
// total time: 45ms

console.log(out1, out2, out3); // prints: [30, 20, 15, 10] x 3

// Conclusion: Execution order and performance is different,
// but return order is still identical

Conclusion

asyncPool() should be the best solution as it allows new requests to start as soon as any previous one finishes.

asyncBatch() is included as a comparison as its implementation is simpler to understand, but it should be slower in performance as all requests in the same batch are required to finish before the next batch starts.

In this contrived example, the non-limited vanilla Promise.all() is of course the fastest, while the others could perform more desirably in a real-world congestion scenario.

Update

The async-pool library that others have already suggested is probably a better alternative to my implementation as it works almost identically and has a more concise implementation with a clever usage of Promise.race(): https://github.com/rxaviers/async-pool/blob/master/lib/es7.js

Hopefully my answer can still serve an educational value.

Upvotes: 3

Venryx

Reputation: 17999

I suggest the library async-pool: https://github.com/rxaviers/async-pool

npm install tiny-async-pool

Description:

Run multiple promise-returning & async functions with limited concurrency using native ES6/ES7

asyncPool runs multiple promise-returning & async functions in a limited concurrency pool. It rejects immediately as soon as one of the promises rejects. It resolves when all the promises complete. It calls the iterator function as soon as possible (under the concurrency limit).

Usage:

const timeout = i => new Promise(resolve => setTimeout(() => resolve(i), i));
await asyncPool(2, [1000, 5000, 3000, 2000], timeout);
// Call iterator (i = 1000)
// Call iterator (i = 5000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 1000 finishes
// Call iterator (i = 3000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 3000 finishes
// Call iterator (i = 2000)
// Iteration is complete, wait until running ones complete...
// 5000 finishes
// 2000 finishes
// Resolves, results are passed in given array order `[1000, 5000, 3000, 2000]`.

Upvotes: 4

Marcelo

Reputation: 1592

This is what I did using Promise.race, inside my code here:

const identifyTransactions = async function() {
  let promises = []
  let concurrency = 0
  for (let tx of this.transactions) {
    if (concurrency > 4)
      await Promise.race(promises).then(r => { promises = []; concurrency = 0 })
    promises.push(tx.identifyTransaction())
    concurrency++
  }
  if (promises.length > 0)
    await Promise.race(promises) //resolve the rest
}

If you wanna see an example: https://jsfiddle.net/thecodermarcelo/av2tp83o/5/

Upvotes: -1

Juan

Reputation: 816

Recursion is the answer if you don't want to use external libraries

downloadAll(someArrayWithData){
  var self = this;

  var tracker = function(next){
    return self.someExpensiveRequest(someArrayWithData[next])
    .then(function(){
      next++;//This updates the next in the tracker function parameter
      if(next < someArrayWithData.length){//Did I finish processing all my data?
        return tracker(next);//Go to the next promise
      }
    });
  }

  return tracker(0); 
}

Upvotes: 0

Andrey

Reputation: 121

Here is a basic example using streaming and 'p-limit'. It streams an HTTP read stream to MongoDB.

const stream = require('stream');
const util = require('util');
const pLimit = require('p-limit');
const es = require('event-stream');
const JSONStream = require('JSONStream');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;


const pipeline = util.promisify(stream.pipeline)

const outputDBConfig = {
    dbURL: 'yr-db-url',
    collection: 'some-collection'
};
const limit = pLimit(3);

const yrAsyncStreamingFunction = async (readStream) => {
    const mongoWriteStream = streamToMongoDB(outputDBConfig);
    const mapperStream = es.map((data, done) => {
        const someDataPromise = limit(() => yr_async_call_to_somewhere())

        someDataPromise.then(
            function handleResolve(someData) {
                data.someData = someData;
                done(null, data);
            },
            function handleError(error) {
                done(error)
            }
        );
    })

    await pipeline(
        readStream,
        JSONStream.parse('*'),
        mapperStream,
        mongoWriteStream
    );
}

Upvotes: 1

Jingshao Chen

Reputation: 3485

bluebird's Promise.map can take a concurrency option to control how many promises should be running in parallel. Sometimes it is easier than .all because you don't need to create the promise array.

const Promise = require('bluebird')

function getCounts() {
  return Promise.map(users, user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
      .then(() => {
        /* snip */
        resolve();
       });
    });
  }, {concurrency: 10}); // <---- at most 10 http requests at a time
}

Upvotes: 21

Matthew Rideout

Reputation: 8518

P-Limit

I have compared promise concurrency limitation with a custom script, bluebird, es6-promise-pool, and p-limit. I believe that p-limit has the simplest, most stripped-down implementation for this need. See their documentation.

Requirements

Your environment must support async/await to run the example.

My Example

In this example, we need to run a function for every URL in the array (like, maybe an API request). Here this is called fetchData(). If we had an array of thousands of items to process, limiting concurrency would definitely be useful to save on CPU and memory resources.

const pLimit = require('p-limit');

// Example Concurrency of 3 promise at once
const limit = pLimit(3);

let urls = [
    "http://www.exampleone.com/",
    "http://www.exampletwo.com/",
    "http://www.examplethree.com/",
    "http://www.examplefour.com/",
]

// Create an array of our promises using map (fetchData() returns a promise)
let promises = urls.map(url => {

    // wrap the function we are calling in the limit function we defined above
    return limit(() => fetchData(url));
});

(async () => {
    // Only three promises are run at once (as defined above)
    const result = await Promise.all(promises);
    console.log(result);
})();

The console log result is an array of your resolved promises' response data.

Upvotes: 170
