Kam

Reputation: 953

How to ignore duplicate keys during 'copy from' in PostgreSQL

I have to dump a large amount of data from a file into a PostgreSQL table. I know it does not support 'IGNORE' or 'REPLACE' as MySQL does. Almost all posts regarding this on the web suggested the same thing: dump the data into a temp table and then do an 'insert ... select ... where not exists ...'.

This will not help in one case: where the file data itself contains duplicate primary keys. Does anybody have an idea how to handle this in PostgreSQL?
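For reference, the temp-table pattern those posts suggest looks roughly like this (table and column names here are placeholders):

```sql
-- Placeholder names: main_table(pk_field PRIMARY KEY, ...)
CREATE TEMP TABLE tmp_table AS SELECT * FROM main_table WITH NO DATA;
COPY tmp_table FROM '/path/to/data/file';
INSERT INTO main_table
SELECT * FROM tmp_table t
WHERE NOT EXISTS (
  SELECT 1 FROM main_table m WHERE m.pk_field = t.pk_field
);
```

The NOT EXISTS check only guards against keys already present in main_table; two rows inside the file that share a primary key still collide on insert.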

P.S. I am doing this from a Java program, if it helps.

Upvotes: 82

Views: 65096

Answers (5)

Balakrishnan

Reputation: 2441

Here is a way to use COPY FROM with protection against duplicates in the target table as well as in the source file (I validated the results on my local instance).

This should also work in Redshift but I haven't validated it.

-- Target table
CREATE TABLE target_table
(id integer PRIMARY KEY, firstname varchar(100), lastname varchar(100));
INSERT INTO target_table (id, firstname, lastname) VALUES (14, 'albert', 'einstein');
INSERT INTO target_table (id, firstname, lastname) VALUES (4, 'isaac', 'newton');

-- COPY FROM with protection against duplicates in the target table as well as in the source file
BEGIN;
  CREATE TEMP TABLE source_file_table ON COMMIT DROP AS (
    SELECT * FROM target_table
  )
  WITH NO DATA;

  -- Simulating COPY FROM
  INSERT INTO source_file_table (id, firstname, lastname) VALUES (14, 'albert', 'einstein');
  INSERT INTO source_file_table (id, firstname, lastname) VALUES (7, 'marie', 'curie');
  INSERT INTO source_file_table (id, firstname, lastname) VALUES (7, 'marie', 'curie');
  INSERT INTO source_file_table (id, firstname, lastname) VALUES (7, 'marie', 'curie');
  INSERT INTO source_file_table (id, firstname, lastname) VALUES (5, 'Neil deGrasse', 'Tyson');

  -- for protection against duplicates in target_table
  UPDATE source_file_table SET id=NULL
  FROM target_table WHERE source_file_table.id=target_table.id;

  INSERT INTO target_table
  SELECT * FROM source_file_table
  -- for protection against duplicates in target_table
  WHERE source_file_table.id IS NOT NULL
  -- for protection against duplicates in the source file
  UNION
  (SELECT * FROM source_file_table
   WHERE source_file_table.id IS NOT NULL
   LIMIT 1);
COMMIT;

Upvotes: 0

Barrel Roll
Barrel Roll

Reputation: 827

PostgreSQL 9.5 now has upsert functionality. You can follow Igor's instructions, except that the final INSERT includes the clause ON CONFLICT DO NOTHING.

INSERT INTO main_table
SELECT *
FROM tmp_table
ON CONFLICT DO NOTHING;
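Putting the pieces together, a minimal end-to-end sketch (table and file names are placeholders) might look like this. Note that DO NOTHING also skips rows that conflict with an earlier row in the same statement, so duplicates inside the file are covered as well:

```sql
-- Placeholder names: main_table(pk_field PRIMARY KEY, ...)
BEGIN;
CREATE TEMP TABLE tmp_table ON COMMIT DROP AS
  SELECT * FROM main_table WITH NO DATA;
COPY tmp_table FROM '/path/to/data/file';
INSERT INTO main_table
SELECT * FROM tmp_table
ON CONFLICT DO NOTHING;    -- skips both in-file and in-table duplicates
COMMIT;
```

One caveat: unlike ON CONFLICT DO UPDATE, DO NOTHING silently discards the conflicting rows, so if you need to keep the newer version of a duplicate you would have to spell out a conflict target and an update action instead.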

Upvotes: 67

Dawn Drescher

Reputation: 957

Igor's answer helped me a lot, but I also ran into the problem Nate mentioned in his comment. I then had the additional problem, perhaps going beyond the question here, that the new data not only contained duplicates internally but also duplicated rows in the existing data. What worked for me was the following.

CREATE TEMP TABLE tmp_table AS SELECT * FROM newsletter_subscribers;
COPY tmp_table (name, email) FROM stdin DELIMITER ' ' CSV;
SELECT count(*) FROM tmp_table;  -- Just to be sure
TRUNCATE newsletter_subscribers;
INSERT INTO newsletter_subscribers
    SELECT DISTINCT ON (email) * FROM tmp_table
    ORDER BY email, subscription_status;
SELECT count(*) FROM newsletter_subscribers;  -- Paranoid again

In the tmp_table, internal and external duplicates become indistinguishable, and the DISTINCT ON (email) clause then removes them. The ORDER BY ensures that the desired row comes first in the result set, and DISTINCT discards all further rows.

Upvotes: 16

Ihor Romanchenko

Reputation: 28571

Use the same approach you described, but delete (or group, or otherwise resolve) the duplicate PKs in the temp table before loading it into the main table.

Something like:

CREATE TEMP TABLE tmp_table 
ON COMMIT DROP
AS
SELECT * 
FROM main_table
WITH NO DATA;

COPY tmp_table FROM 'full/file/name/here';

INSERT INTO main_table
SELECT DISTINCT ON (PK_field) *
FROM tmp_table
ORDER BY (some_fields);

Details: CREATE TABLE AS, COPY, DISTINCT ON

Upvotes: 109

Jester

Reputation: 3327

Insert into a temp table grouped by the key, so you get rid of the duplicates, and then insert if not exists.
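A sketch of that two-step idea (names are placeholders, assuming tmp_table was already loaded via COPY):

```sql
-- Placeholder schema: main_table(pk_field PRIMARY KEY, some_field ...)
INSERT INTO main_table (pk_field, some_field)
SELECT t.pk_field, min(t.some_field)   -- collapse in-file duplicates
FROM tmp_table t
WHERE NOT EXISTS (                     -- skip keys already in main_table
  SELECT 1 FROM main_table m WHERE m.pk_field = t.pk_field
)
GROUP BY t.pk_field;
```

The aggregate (min here, arbitrarily) picks one value per duplicated key; any other aggregate or a DISTINCT ON query would serve the same purpose.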

Upvotes: 0
