Reputation: 243
I am new to Node.js and promises, and I am using the Q promise library. I am facing a situation where I need to resolve an array of promises and then use the results. I have used Q.all to resolve all the promises. Here each promise is doing a DB operation, so the number of active DB connections exceeds the size of the connection pool and I get an error.
I have gone through the Q library documentation (https://github.com/kriskowal/q) but could not find a solution.
const process = () => {
  const arrayOfIds = [id1, id2, id3, id4 .... idn]
  // getById fetches data from the DB asynchronously
  const arrayOfPromises = arrayOfIds.map(id => getById(id))
  // Q.all takes the array itself, not an array wrapped in another array
  return Q.all(arrayOfPromises)
    .then(resultArray => {
      // utilization of the result array
    })
}
Can anyone suggest a better approach to do the same? Thanks in advance.
Upvotes: 0
Views: 43
Reputation: 1319
As suggested by other users, a better solution to your problem would be to use a WHERE id IN ( .. ) style query.
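A minimal sketch of that batched-query idea, assuming a node-postgres-style driver where query(text, values) accepts $1-style placeholders; the table name, column name, and db object here are illustrative stand-ins for your own setup:

```javascript
// Build a single parameterized "WHERE id IN (...)" query for all ids at once,
// so one round trip (and one pooled connection) replaces n getById calls.
// Placeholder style ($1, $2, ...) follows node-postgres; adapt for your driver.
function buildGetByIdsQuery(ids) {
  const placeholders = ids.map((_, i) => `$${i + 1}`).join(', ');
  return {
    text: `SELECT * FROM items WHERE id IN (${placeholders})`,
    values: ids,
  };
}

// Usage sketch (db is your own connection/pool object):
// const { text, values } = buildGetByIdsQuery(arrayOfIds);
// db.query(text, values).then(resultArray => { /* all rows in one result */ });
```

Using placeholders rather than interpolating the ids into the SQL string keeps the query safe from injection.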
The problem in your case is the concurrency of operations. N operations are being executed in parallel, which is causing the issue. In the worst case, they may even cause unexpected crashes or increased resource usage for your Node.js process.
Neither Q nor the native Promise has a method to control the concurrency of execution. If you absolutely have to resolve all the promises together, I would suggest using Bluebird.map (link).
const process = () => {
  const arrayOfIds = [id1, id2, id3, id4 .... idn]
  return Bluebird.map(arrayOfIds,
    (id) => getById(id),
    { concurrency: 5 } // default is +Infinity
  )
    .then(resultArray => {
      // utilization of the result array
    })
}
Of course, you'll have to ensure that the promise returned by getById is Bluebird-compatible (not sure about Q promises, but native Promises are).
Upvotes: 0
Reputation: 19987
Q is just a utility library; I don't see why it would handle a DB-specific problem.
One thing you can do is design some queue-like mechanism to control the total number of concurrent DB operations allowed. In other words, control how many of the promises in your array are in flight at once.
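The queue-like mechanism described above can be sketched as a small promise pool in plain Node.js, with no extra library. Everything here (runWithConcurrency, the stand-in getById) is illustrative, not an existing API:

```javascript
// Minimal promise pool: runs at most `limit` tasks at a time.
// `tasks` is an array of functions, each returning a promise when called,
// so no DB work starts until a worker picks the task up.
function runWithConcurrency(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;

  // Each worker repeatedly pulls the next unstarted task until none remain.
  const worker = () => {
    if (next >= tasks.length) return Promise.resolve();
    const index = next++;
    return Promise.resolve(tasks[index]()).then(result => {
      results[index] = result; // keep results in original order
      return worker();
    });
  };

  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  return Promise.all(workers).then(() => results);
}

// Usage sketch with a stand-in for getById:
const getById = id => Promise.resolve({ id });
const arrayOfIds = [1, 2, 3, 4, 5];
runWithConcurrency(arrayOfIds.map(id => () => getById(id)), 2)
  .then(resultArray => console.log(resultArray.map(r => r.id))); // logs [ 1, 2, 3, 4, 5 ]
```

With a limit of 2, at most two getById calls (and thus two pooled connections) are active at any moment, which is exactly what keeps the connection pool from being exhausted.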
I don't know the nature of your problem, but why don't you fetch the data in one batched query instead of many getById calls?
Upvotes: 1