Eazy

Reputation: 3492

Postgresql: Backup and restore some tables with primary keys

I have a production PostgreSQL database which consists of 70 tables. Some of them are very big and some are small. I also have a local PostgreSQL database on my machine, and I want the content of some of my local tables to match the production ones. If I just back up some tables with pgAdmin on the production database and then try to restore them on my local machine, I get constraint errors, because, for example, table A has a foreign key to table B, and so on.

How can I copy some tables from the production database and restore them on my local machine, which already has the schema and tables, without constraint errors?

P.S. I can't just dump the whole production database, because some of the tables are VERY BIG.

Upvotes: 7

Views: 3637

Answers (3)

LisMorski

Reputation: 399

  1. Re-create the table structure in your local database, but add the DEFERRABLE INITIALLY DEFERRED option to the problematic foreign key constraints
  2. Use pg_dump to dump the selected table data from the production DB to a file. Write BEGIN; at the very beginning of the file; at the end, append UPDATE problem_no_1 SET fkey_column = NULL; for every FK column that causes problems, and of course finish with COMMIT;
  3. Execute this file against your local DB
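A minimal sketch of the DEFERRABLE idea from step 1, using hypothetical tables `a` and `b` (not from the question); with INITIALLY DEFERRED, the FK is checked at COMMIT rather than per row, so rows can be loaded in any order:

```sql
-- Hypothetical schema; adapt names to your own tables.
CREATE TABLE b (
    id integer PRIMARY KEY
);

CREATE TABLE a (
    id   integer PRIMARY KEY,
    b_id integer REFERENCES b (id)
         DEFERRABLE INITIALLY DEFERRED  -- FK checked at COMMIT, not per statement
);

-- Inside one transaction, insert order no longer matters:
BEGIN;
INSERT INTO a VALUES (1, 10);  -- referenced b row does not exist yet
INSERT INTO b VALUES (10);     -- constraint is satisfied by COMMIT time
COMMIT;
```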

Upvotes: 0

fonini

Reputation: 3351

I'm not sure if I understood correctly, but if you get constraint check errors you can disable the foreign key constraints, restore the tables, and then enable the constraints again.
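PostgreSQL has no single "disable constraint" command; one common way to get this effect (it requires superuser rights, and `a` here is a hypothetical table name) is to disable the table's triggers, which includes the internal triggers that enforce foreign keys:

```sql
-- Superuser only; also suspends the internal FK enforcement triggers.
ALTER TABLE a DISABLE TRIGGER ALL;

-- ... restore the table data here, e.g. with \i a_dump.sql in psql ...

ALTER TABLE a ENABLE TRIGGER ALL;
```

Note that re-enabling the triggers does not re-check rows loaded in the meantime, so this only leaves the database consistent if the restored data actually satisfies the constraints.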

Upvotes: 0

Miljen Mikic

Reputation: 15251

Dump the complete production database, but omit the data for the very big tables:

$ pg_dump -t <VERY_BIG_TABLE_NAME> -s

If you want the data as well, omit the -s option. Since you would have to repeat this for all 70 tables, a quicker solution is to divide the tables into schemas:

$ pg_dump -n <SCHEMA_NAME_WITH_VERY_BIG_TABLES> -s
$ pg_dump -n <SCHEMA_NAME_WITH_SMALL_TABLES>
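Put together, the whole workflow might look like this (the schema names, database names, and file names are placeholders, not from the question):

```shell
# On production: structure only for the schema holding the big tables,
# structure plus data for the schema holding the small tables.
pg_dump -n big_tables_schema -s proddb > big_schema.sql
pg_dump -n small_tables_schema proddb  > small_schema.sql

# On the local machine: restore both files with psql.
psql -d localdb -f big_schema.sql
psql -d localdb -f small_schema.sql
```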

Upvotes: 1
