Reputation: 7534
I have an issue where Postgres complains of a duplicate ID after an import of some initial data, and I am trying to work out how to update the id column's counter (sequence).
Details:
I have recently uploaded some initial data into a Postgres table, where the id is set to autoincrement in my Sequelize model definition. For example:
sequelize.define('user', {
  id: {
    type: Sequelize.INTEGER,
    primaryKey: true,
    autoIncrement: true
  },
  name: Sequelize.STRING
});
The data insert looks like:
INSERT INTO "users" ("id","name") VALUES (1, 'bob');
INSERT INTO "users" ("id","name") VALUES (2, 'brooke');
INSERT INTO "users" ("id","name") VALUES (3, 'nico');
Now, when I try to insert a new record from my Node.js application, it complains that Key (id)=(1) already exists. The SQL Sequelize generates is of the form:
INSERT INTO "users" ("id","name") VALUES (DEFAULT, 'nico');
If I empty the table and try again, or retry the operation enough times, I can see the counter does increment. The issue seems to be that Postgres cannot work out the current max id from the existing records.
What is the right way to tell Postgres to update its sequence counters after uploading initial data into the database?
BTW using Postgres 9.6
Upvotes: 20
Views: 21707
Reputation: 3698
Root cause
This error occurs when the sequence behind a column falls behind the maximum value already stored in that column, and the column is the primary key.
Why the sequence falls behind
The most common cause is inserting rows with explicit id values (for example from an import or seed script), which bypasses the sequence so it is never advanced.
Resolution
To find the sequence behind the column
SELECT pg_get_serial_sequence('table_name', 'column');
SELECT pg_get_serial_sequence('customrules', 'Id');
To get the current value of the sequence returned by the previous query
SELECT last_value FROM table_seq;
SELECT last_value FROM public."customrules_Id_seq";
To get the maximum value in the table
SELECT MAX(id) FROM table_name;
SELECT MAX("Id") FROM customrules;
If the two values do not match, reset the sequence
SELECT setval('table_seq', (SELECT MAX(id) FROM table_name));
SELECT setval('public."customrules_Id_seq"', (SELECT MAX("Id") FROM customrules));
Run your insert script after these steps and the error should no longer occur.
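The lookup and the reset can also be combined into one statement; a minimal sketch, reusing the customrules table and "Id" column from the examples above and allowing for an empty table:
-- Resolve the sequence name and reset it in one go; COALESCE covers an empty table.
SELECT setval(pg_get_serial_sequence('customrules', 'Id'),
              COALESCE((SELECT MAX("Id") FROM customrules), 1));
-- With an empty table this makes the next generated "Id" 2; use setval(..., 1, false) if it must start at 1.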
Upvotes: 0
Reputation: 11
I faced the same issue when trying to insert into a table:
const Template = require("../models/templates");
async function addTemplate(user, body) {
  try {
    body.partner_id = user.partner_id;
    await Template.create(body);
    return true;
  } catch (error) {
    console.error("Error finding similar Template:", error);
    return null;
  }
}
module.exports = addTemplate;
The error:
Error finding similar Template: Error
at Query.run (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\sequelize\lib\dialects\postgres\query.js:50:25)
at C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\sequelize\lib\sequelize.js:315:28
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async PostgresQueryInterface.insert (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\sequelize\lib\dialects\abstract\query-interface.js:308:21)
at async model.save (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\sequelize\lib\model.js:2490:35)
at async template.create (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\sequelize\lib\model.js:1362:12)
at async addTemplate (C:\Users\rocks\Documents\Partner-Dashboard\Backend\query\addTemplate.js:9:20)
at async C:\Users\rocks\Documents\Partner-Dashboard\Backend\controllers\auth.js:629:24 {
name: 'SequelizeUniqueConstraintError',
errors: [
ValidationErrorItem {
message: 'id must be unique',
type: 'unique violation',
path: 'id',
value: '4',
origin: 'DB',
instance: [template],
validatorKey: 'not_unique',
validatorName: null,
validatorArgs: []
}
],
parent: error: duplicate key value violates unique constraint "templates_pkey"
at Parser.parseErrorMessage (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\pg-protocol\dist\parser.js:287:98)
at Parser.handlePacket (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\pg-protocol\dist\parser.js:126:29)
at Parser.parse (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\pg-protocol\dist\parser.js:39:38)
at Socket.<anonymous> (C:\Users\rocks\Documents\Partner-Dashboard\Backend\node_modules\pg-protocol\dist\index.js:11:42)
at Socket.emit (node:events:514:28)
at addChunk (node:internal/streams/readable:376:12)
at readableAddChunk (node:internal/streams/readable:349:9)
at Readable.push (node:internal/streams/readable:286:10)
at TCP.onStreamRead (node:internal/stream_base_commons:190:23) {
length: 192,
severity: 'ERROR',
code: '23505',
detail: 'Key (id)=(4) already exists.',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: 'public',
table: 'templates',
column: undefined,
dataType: undefined,
constraint: 'templates_pkey',
file: 'nbtinsert.c',
line: '673',
routine: '_bt_check_unique',
sql: 'INSERT INTO "templates" ("id","partner_id","title","description","templates_html","createdAt","updatedAt") VALUES (DEFAULT,$1,$2,$3,$4,$5,$6) RETURNING "id","partner_id","title","description","templates_html","createdAt","updatedAt";',
parameters: [
1002,
'Different Title',
'Different Description',
'Different HTML Content',
'2024-01-20 06:05:48.934 +00:00',
'2024-01-20 06:05:48.934 +00:00'
]
},
original: error: duplicate key value violates unique constraint "templates_pkey" { /* identical to parent above */ },
fields: { id: '4' },
sql: 'INSERT INTO "templates" ("id","partner_id","title","description","templates_html","createdAt","updatedAt") VALUES (DEFAULT,$1,$2,$3,$4,$5,$6) RETURNING "id","partner_id","title","description","templates_html","createdAt","updatedAt";'
}
I found that the issue arose from my seeder; I was seeding like this:
{
  id: 1,
  partner_id: 1001,
  title: "Product Showcase",
  description: "description 1",
},
{
  id: 2,
  partner_id: 1002,
  title: "Product Showcase",
  description: "description 1",
},
I just removed the id:
{
  partner_id: 1001,
  title: "Product Showcase",
  description: "description 1",
},
{
  partner_id: 1002,
  title: "Product Showcase",
  description: "description 1",
},
Now it works.
Upvotes: 1
Reputation: 989
SELECT setval('users_id_seq', (SELECT MAX(id) from users));
The name of the sequence is auto-generated and by default takes the form tablename_columnname_seq.
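If you would rather confirm the generated name than assume it, pg_get_serial_sequence (used in other answers here too) returns it; for example, for the users table from the question:
SELECT pg_get_serial_sequence('users', 'id');  -- e.g. public.users_id_seq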
Upvotes: 0
Reputation: 33
This happened to me because I inserted records using literal numeric values (instead of DEFAULT or leaving the column out) for the auto-incremented column. Doing so bypasses the column's underlying sequence, so it is never incremented and its value falls out of sync with the values actually stored in the column.
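A small illustration of that behaviour, using a throwaway table (not one from the question):
CREATE TABLE demo (id serial PRIMARY KEY, name text);
-- Explicit ids: the sequence's nextval() is never called, so it stays at its starting point.
INSERT INTO demo (id, name) VALUES (1, 'a'), (2, 'b');
-- DEFAULT asks the sequence for the next id, which is still 1 -> duplicate key error.
INSERT INTO demo (name) VALUES ('c');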
Upvotes: 1
Reputation: 7534
After a bit more searching, it turns out this will do what I need:
SELECT setval('users_id_seq', max(id)) FROM users;
This sets the sequence to the current maximum id in the table, in this case my users table. Note, to check which sequence is associated with a column, this works:
SELECT pg_get_serial_sequence('patients', 'id');
The only thing to note is that the returned value carries a schema prefix; I ignore the public. part when building the setval() call.
I'll add the setval() call to my initial data script.
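For reference, that makes the initial data script from the question look roughly like this (a sketch assuming the default users_id_seq sequence name):
INSERT INTO "users" ("id","name") VALUES (1, 'bob');
INSERT INTO "users" ("id","name") VALUES (2, 'brooke');
INSERT INTO "users" ("id","name") VALUES (3, 'nico');
-- Re-sync the sequence so the next DEFAULT id is 4.
SELECT setval('users_id_seq', (SELECT max(id) FROM users));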
Upvotes: 36
Reputation: 141
Try dropping the table before you initially insert data; it may be persisting from a previous run, in which case (1, 'bob') would already be in your table before you tried adding it again.
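For example (destructive, so only if you are happy to recreate the table from the model or the seed script):
DROP TABLE IF EXISTS "users";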
Upvotes: 0