Reputation: 1745
I am trying to test a DB insert operation with Mocha in a Node.js Express application that uses the fp-ts and pg npm packages. I am running the tests on my local machine, and the database is hosted in the DigitalOcean cloud. The tests complete successfully, but the insert test takes 240+ ms to complete, even though the table holds at most 12 records. I could use a hand narrowing down the likely culprit.
Mocha Test db initialization
before(() => {
  pool = new pg.Pool({
    user: process.env.USER_NAME,
    host: process.env.HOST_NAME,
    database: process.env.DATABASE,
    password: process.env.PASSWORD,
    port: parseInt(process.env.DB_PORT!)
  });
});
Mocha Test
it("It should save a User", () => {
  const userRepository = makeUserRepository(pool);
  return pipe(
    {
      emailAddress: '[email protected]',
      firstName: 'John',
      lastName: 'Doe',
      password: 'slkfdjjfiwjfojisjsdlkfjksdlfjas'
    },
    userRepository.saveUser,
    fold(
      (error: ErrorBase) => async () => {
        console.time("dbsavepipeoperationerror");
        expect(error).to.be.null;
        console.timeEnd("dbsavepipeoperationerror");
      },
      (user: User) => async () => {
        console.time("dbsavepipeoperation");
        tempUserId = user.userId;
        expect(user).to.be.an('object');
        console.timeEnd("dbsavepipeoperation");
      }
    )
  )();
});
Repository function call that performs the save operation
const saveUser = (user: User): TaskEither<ErrorBase, User> => tryCatch(
  async () => {
    const paramList: string[] = [];
    const values: string[] = [];
    const {emailAddress, password, firstName, lastName} = user;
    if (emailAddress) {
      paramList.push("emailAddress");
      values.push(emailAddress);
    }
    if (password) {
      paramList.push("password");
      values.push(password);
    }
    if (firstName) {
      paramList.push("firstName");
      values.push(firstName);
    }
    if (lastName) {
      paramList.push("lastName");
      values.push(lastName);
    }
    const queryString = `INSERT INTO "User"(` + paramList.map((param) => `"${param}"`).join(",") + `)
      VALUES(` + paramList.map((_, i) => "$" + (i + 1)).join(",") + `)
      RETURNING *`;
    console.time("dbsave");
    const result: pg.QueryResult<User> = await pool.query(queryString, values);
    console.timeEnd("dbsave");
    return result?.rows[0];
  },
  (reason: any) => (
    {message: reason.message, name: reason.name, details: '', innerError: reason as Error}
  )
);
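As an aside unrelated to the timing question, the dynamic column/placeholder construction in saveUser can be factored into a pure helper, which makes that logic unit-testable without a live pool. The buildInsert name and shape below are hypothetical, not part of the original repository:

```typescript
// Hypothetical pure helper mirroring the dynamic INSERT construction in
// saveUser: only the fields actually present on the object become columns.
type PartialUser = Partial<{
  emailAddress: string;
  password: string;
  firstName: string;
  lastName: string;
}>;

const buildInsert = (user: PartialUser): { text: string; values: string[] } => {
  const columns: string[] = [];
  const values: string[] = [];
  // Fixed column order; undefined/empty fields are skipped, matching the
  // if-chains in saveUser above.
  (["emailAddress", "password", "firstName", "lastName"] as const).forEach((key) => {
    const value = user[key];
    if (value) {
      columns.push(key);
      values.push(value);
    }
  });
  const text =
    `INSERT INTO "User"(` + columns.map((c) => `"${c}"`).join(",") + `)` +
    ` VALUES(` + columns.map((_, i) => `$${i + 1}`).join(",") + `)` +
    ` RETURNING *`;
  return { text, values };
};
```

A helper like this keeps the SQL assembly out of the async path, so it can be covered by fast, pool-free tests.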
Here are the time durations for the insert and for the test to complete:
User Repository Tests
dbsave: 237.2822265625 ms
dbsave: 237.451ms
dbsavepipeoperation: 0.114013671875 ms
dbsavepipeoperation: 0.19ms
✔ It should save a User (240ms)
In my package.json file, I use the following script to run the tests:
"test": "ts-mocha --config src/tests/runners/mocha/.mocharc.json --recursive --in-order --watch-recursive"
Update: I added three new tests to determine whether the pool establishing connections might be the culprit. The first save took 263 ms, and the next three, including the one I was initially trying to run, took ~50 ms each. Is there a way to force the pool to establish a connection before the queries need to run? I thought that was what the pool was supposed to do. If not, how can I force a connection upon pool creation? It seems the first query call is the one establishing it.
it("It should test the connection speed", async () => {
  const queryString = `INSERT INTO profile."User"("firstName", "lastName", "emailAddress")
    VALUES($1, $2, $3)
    RETURNING *`;
  const values = ['asdfasdfsad', 'asdfsadf', '[email protected]'];
  const result: pg.QueryResult<User> = await pool.query(queryString, values);
  expect(result).to.be.an('object');
});
it("It should test the connection speed - 2", async () => {
  const queryString = `INSERT INTO profile."User"("firstName", "lastName", "emailAddress")
    VALUES($1, $2, $3)
    RETURNING *`;
  const values = ['asdfasdfsad', 'asdfsadf', '[email protected]'];
  const result: pg.QueryResult<User> = await pool.query(queryString, values);
  expect(result).to.be.an('object');
});
it("It should test the connection speed - 3", async () => {
  const queryString = `INSERT INTO profile."User"("firstName", "lastName", "emailAddress")
    VALUES($1, $2, $3)
    RETURNING *`;
  const values = ['asdfasdfsad', 'asdfsadf', '[email protected]'];
  const result: pg.QueryResult<User> = await pool.query(queryString, values);
  expect(result).to.be.an('object');
});
User Repository Tests
✔ It should test the connection speed (263ms)
✔ It should test the connection speed - 2 (46ms)
✔ It should test the connection speed - 3 (44ms)
✔ It should save an object - User (56ms)
Upvotes: 0
Views: 58
Reputation: 1101
Capturing this here as an answer. As we discussed in the comments, the pg.Pool is created with the capacity to run X simultaneous connections to the database and helps by managing the opening and closing of those connections. A new connection is only opened when one is needed, i.e. the pool uses "lazy" client creation. Running a query will automatically create a client connection if one is needed, or reuse an existing one if possible.
If you want to explicitly instantiate a connection from the pool, you can call pool.connect
as shown in this example from the docs (https://node-postgres.com/apis/pool#poolconnect):
import { Pool } from 'pg'
const pool = new Pool()
const client = await pool.connect()
await client.query('SELECT NOW()')
client.release()
As shown in the example, remember to return the client to the pool after you use it by calling release().
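Applied to the Mocha setup in the question, one option is to warm the pool inside the before hook by checking out one or more clients up front and releasing them before the first test runs. The sketch below assumes that approach; warmPool is a hypothetical helper written against pg.Pool's connect()/release() contract, not a pg API:

```typescript
// Hypothetical warm-up helper: checks out `count` clients simultaneously so
// the pool opens that many physical connections, then releases them all back.
// Works with any pool exposing pg.Pool's connect()/release() contract.
interface PoolLike {
  connect(): Promise<{ query(text: string): Promise<unknown>; release(): void }>;
}

const warmPool = async (pool: PoolLike, count = 1): Promise<void> => {
  // Acquire all clients before releasing any, otherwise the pool would just
  // hand the same warm connection back repeatedly.
  const clients = await Promise.all(
    Array.from({ length: count }, () => pool.connect())
  );
  // A trivial query forces each connection through full session startup.
  await Promise.all(clients.map((c) => c.query("SELECT 1")));
  clients.forEach((c) => c.release());
};
```

In the question's before hook you would await warmPool(pool) right after creating the pool, so the ~200 ms connection handshake to the remote DigitalOcean host is paid once during setup rather than inside the first it block.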
Upvotes: 0