digitai

Reputation: 1842

Node Promises.all() takes too long to execute

I have to insert a row into a table with data about each sent email, after the email is sent.

Inside a loop I'm filling an array with promises to be resolved by Promise.all(). insertData is a function that inserts the data, given two arguments: connector, the connection pool, and dataToInsert, an object with the data to be inserted.

async function sendAndInsert(payload) {
  for (const data of payload) {
    const array = [];

    const dataToInsert = {
      id: data.id,
      campaign: data.campaign,
    };

    for (const email of data) {
      array.push(insertData(connector, dataToInsert));
    }
    await Promise.all(array);
  }
}

Afterwards, the function is invoked:

async function invoke() {
  await sendAndInsert(toInsertdata);
}

To insert 5000 records, it takes about 10 minutes, which is nuts.

Using

nodejs v10

pg-txclient as DB connector to PostgreSql.

What I've checked and can be ruled out as a possible source of error:

  1. Inserted random stuff to the table using the same connection.

I'm sure there is no issue with the DB server or the connection. The issue must be in the Promise.all() / await stuff.

Upvotes: 1

Views: 862

Answers (1)

Mack

Reputation: 771

It looks like each record is being inserted through a separate call to insertData. Each call is likely to include overhead such as network latency, and 5000 requests cannot all be handled simultaneously. One call to insertData has to send its data to the database and wait for a response before the next call can even start sending its data. 5000 requests over 10 minutes works out to about 120 ms per request, which is not unreasonable if the database is on another machine.

A better strategy is to insert all of the objects in one network request. Modify insertData so that it accepts an array of objects to insert, instead of one at a time. Then all the data can be sent to the database at once, and you pay the latency cost only a single time.
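As a rough sketch of what that could look like: the helper below builds one parameterized multi-row INSERT statement from an array of { id, campaign } objects like the ones in your loop. The table name `sent_emails` and the node-postgres-style `connector.query(text, values)` call are assumptions for illustration; I don't know pg-txclient's exact API, so adapt the execution part to whatever it exposes.

```javascript
// Build a single multi-row INSERT for an array of records.
// Each row contributes two parameters, so the placeholders run
// ($1, $2), ($3, $4), ... and `values` is the flattened data.
function buildBulkInsert(table, rows) {
  const values = [];
  const placeholders = rows.map((row, i) => {
    values.push(row.id, row.campaign);
    return `($${i * 2 + 1}, $${i * 2 + 2})`;
  });
  return {
    text: `INSERT INTO ${table} (id, campaign) VALUES ${placeholders.join(', ')}`,
    values,
  };
}

// Hypothetical usage (query API assumed, adapt to pg-txclient):
// const { text, values } = buildBulkInsert('sent_emails', allDataToInsert);
// await connector.query(text, values);
```

With this shape, the whole batch travels in one round trip instead of 5000, so the per-request latency is paid once. (Very large batches may need chunking to stay under PostgreSQL's parameter limit of 65535 per statement.)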

Upvotes: 3
