Reputation: 1909
I am importing an Excel file into a Postgres database with Express and node-pg.
Currently I am looping through the Excel rows and executing an insert for every row, but I feel it's not the right way:
workbook.xlsx.readFile(excel_file).then(function () {
    // get the first worksheet
    var worksheet = workbook.getWorksheet(1);
    // Loop through all rows
    worksheet.eachRow(function (row, rowNumber) {
        // Commit to DB only from line 2 and up. We want to exclude headers from excel file
        if (rowNumber > 1) {
            // Loop through all values and build array to pass to DB function
            row.eachCell(function (cell, colNumber) {
                arrSQLParams.push(cell.value);
            });
            // Add the user id from session to the array
            arrSQLParams.push(user);
            // Insert into DB
            db.query(strSQL, arrSQLParams, function (err, result) {
                if (err) {
                    console.log(err);
                    ret = false;
                }
            });
            // Empty the array for new query
            arrSQLParams = [];
        }
    });
});
Is there a better way to do this to improve performance?
Upvotes: 17
Views: 36260
Reputation: 127
You can use the package https://www.npmjs.com/package/pg-essential. It applies a patch on top of node-postgres, and you only need to call its executeBulkInsertion function: build an array of the objects to be inserted and pass it to executeBulkInsertion.
let bulkData = [];
for (const user of users) {
    bulkData.push(user);
}
await db.executeBulkInsertion(bulkData, [array of column names], [table name]);
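If you go this route, a rough sketch of adapting it to the Excel import from the question might look like the snippet below. The column names, table name, and the mapping of spreadsheet cells to object properties are placeholders you would replace with your own schema; the executeBulkInsertion call follows the signature shown above.
// Hypothetical adaptation to the question's Excel import;
// column/table names are placeholders for your schema.
let rows = [];
worksheet.eachRow(function (row, rowNumber) {
    if (rowNumber > 1) {
        // Map each spreadsheet row to an object keyed by column name
        rows.push({
            col_a: row.getCell(1).value,
            col_b: row.getCell(2).value,
            user_id: user
        });
    }
});
await db.executeBulkInsertion(rows, ['col_a', 'col_b', 'user_id'], 'my_table');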
Upvotes: 0
Reputation: 25940
Following the clarification provided by the author that up to 1000 records need to be inserted at a time, the solution suggested in Multi-row insert with pg-promise is exactly what the author needs, in terms of both performance and flexibility.
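As a minimal sketch of that approach, assuming a table named my_table with columns col_a, col_b and user_id (names chosen here only for illustration), you would collect all the Excel rows first and then generate a single multi-row INSERT with pg-promise's helpers:
const pgp = require('pg-promise')({ capSQL: true }); // capitalize generated SQL
const db = pgp(/* your connection config */);

// Declare the column set once, and reuse it for every import:
const cs = new pgp.helpers.ColumnSet(['col_a', 'col_b', 'user_id'], { table: 'my_table' });

// Collect all data rows (skipping the header row) into an array of objects:
const rows = [];
worksheet.eachRow(function (row, rowNumber) {
    if (rowNumber > 1) {
        rows.push({
            col_a: row.getCell(1).value,
            col_b: row.getCell(2).value,
            user_id: user
        });
    }
});

// Generate one INSERT statement with many VALUES tuples and execute it:
const insert = pgp.helpers.insert(rows, cs);
db.none(insert)
    .then(() => console.log('import complete'))
    .catch(err => console.log(err));
This sends a single round trip to the server instead of one query per spreadsheet row, which is where the performance gain comes from.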
UPDATE
A must-read article: Data Imports.
Upvotes: 6