Reputation: 485
When I start enough concurrent transactions using sequelize.transaction(), Sequelize stops making progress.
For example, if I start 5 transactions with pool.max = 5, Sequelize executes START TRANSACTION; in Postgres and then executes nothing further.
Test case is:
const { Sequelize } = require('sequelize');

function parseArgs(argv = process.argv) {
  return argv.slice(argv.findIndex((x) => x === __filename) + 1);
}

const args = parseArgs();
const count = parseInt(args[0]) || 5;

const sequelize = new Sequelize('postgres://postgres@localhost:5432/postgres', {
  seederStorage: 'sequelize',
  dialect: 'postgres',
  define: {
    timestamps: false,
  },
  // logging: false,
  pool: {
    max: count, // keeping max pool size equal to concurrent transactions
  },
});

async function query() {
  await sequelize.transaction(async (transaction) => {
    // Just doing any query inside transaction. Using a model yields the same result.
    // If no query is run inside the transaction, sequelize progresses fine.
    const one = await sequelize.query('SELECT 1');
    console.log('Queried one', one);
  });
}

async function concurrentQuery({ count = 1 }) {
  const promises = [];
  for (let i = 0; i < count; i++) {
    promises.push(query());
  }
  console.log('Started promises', { count: promises.length });
  await Promise.all(promises);
}

concurrentQuery({ count })
  .then(() => {
    console.log('result', 'pool config', sequelize.config.pool);
  })
  .catch((err) => {
    console.error(err);
  })
  .finally(() => {
    sequelize.close()
      .then(() => {
        console.log('Closed sequelize');
      })
      .catch((err) => {
        console.error(err);
      });
  });
To reproduce:
$ node sequelizeTransactionPoolSize.js
And query postgres:
postgres@localhost:postgres> SELECT pid, now() - pg_stat_activity.query_start AS duration, query, state FROM pg_stat_activity WHERE (now() - pg_stat_activity.query_start) > interval '2 seconds';
+-------+----------------+--------------------+---------------------+
| pid | duration | query | state |
|-------+----------------+--------------------+---------------------|
| 1042 | 0:00:22.191851 | START TRANSACTION; | idle in transaction |
| 1043 | 0:00:22.187831 | START TRANSACTION; | idle in transaction |
| 1044 | 0:00:22.188163 | START TRANSACTION; | idle in transaction |
| 1045 | 0:00:22.188316 | START TRANSACTION; | idle in transaction |
| 1046 | 0:00:22.180694 | START TRANSACTION; | idle in transaction |
+-------+----------------+--------------------+---------------------+
SELECT 5
Time: 0.019s
Tested with Sequelize v5 (with node 12) and v6 (with node 14).
Is this a bug? Or am I supposed to set pool.max high enough to cover traffic spikes? If the expected peak is 200 concurrent transactions, do I set pool.max = 201, possibly overloading Postgres?
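My working theory, sketched as a minimal plain-Node model (a hypothetical Pool class, not Sequelize's actual pool implementation): each transaction holds one pooled connection, and if the query inside it asks the pool for a second connection, all slots are already held, so every task waits forever; reusing the held connection completes fine.

```javascript
// Minimal model of a connection pool (hypothetical Pool class; this is
// not Sequelize's implementation, just a sketch of the suspected behavior).
class Pool {
  constructor(max) {
    this.max = max;
    this.inUse = 0;
    this.waiters = [];
  }
  acquire() {
    if (this.inUse < this.max) {
      this.inUse += 1;
      return Promise.resolve();
    }
    // Pool exhausted: wait until some other holder releases.
    return new Promise((resolve) => this.waiters.push(resolve));
  }
  release() {
    const next = this.waiters.shift();
    if (next) next(); // hand the slot directly to a waiter
    else this.inUse -= 1;
  }
}

// Each "transaction" holds a connection; if reuseConnection is false the
// inner query asks the pool for a *second* connection, mimicking a query
// that is not run on the transaction's own connection.
async function run(count, reuseConnection) {
  const pool = new Pool(count);
  let completed = 0;
  const tasks = Array.from({ length: count }, async () => {
    await pool.acquire(); // connection held by the transaction
    if (!reuseConnection) {
      await pool.acquire(); // second connection for the inner query
      pool.release();
    }
    completed += 1;
    pool.release();
  });
  // Race against a timeout so the deadlocked variant doesn't hang forever.
  await Promise.race([
    Promise.all(tasks),
    new Promise((resolve) => setTimeout(resolve, 200)),
  ]);
  return completed;
}

run(5, true).then((n) => console.log('reusing the connection:', n, 'of 5 completed'));   // 5 of 5
run(5, false).then((n) => console.log('separate connection:', n, 'of 5 completed'));     // 0 of 5
```

With reuseConnection = false, all 5 tasks hold a connection and all 5 queue for a sixth that can never be freed, which matches the 5 sessions stuck "idle in transaction" above.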
Upvotes: 0
Views: 1753
Reputation: 27607
Database transactions and the number of available connections in a pool are not related in the way your question suggests. The connection pool works regardless of whether your queries use transactions; it hands out connections as needed. Starting a transaction ensures that the changes you make within it are atomic and don't conflict with updates from outside the transaction. Because you are not passing the transaction you create into the subsequent queries, those queries try to acquire their own connections from the already-exhausted pool, and everything hangs waiting for you to commit or roll back the transaction. This is easier to read if you use async/await:
async function query() {
  // get a transaction
  const transaction = await sequelize.transaction();
  try {
    // pass the transaction to the query
    const one = await sequelize.query('SELECT 1', { transaction });
    // commit so the connection is returned to the pool
    await transaction.commit();
    console.log('Queried one', one);
  } catch (err) {
    // roll back on failure so the connection is not left idle in transaction
    await transaction.rollback();
    throw err;
  }
}
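With the managed form used in the question's test case, the equivalent fix is to pass the callback's transaction into each query. A sketch against the `sequelize` instance from the question (not verified here against a live database):

```
async function query() {
  // Managed transaction: commit/rollback happen automatically,
  // but the transaction must still be passed to every query explicitly.
  await sequelize.transaction(async (transaction) => {
    const one = await sequelize.query('SELECT 1', { transaction });
    console.log('Queried one', one);
  });
}
```

Queries given the `transaction` option run on the transaction's own connection instead of requesting a fresh one from the pool.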
Upvotes: 2