kenyee

Reputation: 2359

Archive ancient data from one database table to another database table in Postgresql 9.1?

Ideally, the new writable CTE feature in 9.1 (data-modifying statements in WITH) would have supported crossing database boundaries, but it appears to work only within a single database.

WITH moved_rows AS (
    DELETE FROM events
    WHERE
        "date" >= '2010-01-01' AND
        "date" < '2011-01-01'
    RETURNING *
)
INSERT INTO events_archive
SELECT * FROM moved_rows;

I wish I could specify the source as something like "DELETE FROM LiveDB.events" and then do "INSERT INTO ArchiveDB.events". dblink doesn't seem to work for fetching the list of rows to move, and even if it did, I'm not sure this statement would be transactionally safe across databases...
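For reference, a dblink-based attempt would look roughly like the following, run while connected to the archive database. This is only a sketch: the connection string and the column definition list are assumptions, and even if it runs, the remote DELETE commits independently of the local INSERT, so it is still not atomic across the two databases.

```sql
-- Hypothetical sketch, run while connected to ArchiveDB.
-- dblink() can execute a row-returning statement such as
-- DELETE ... RETURNING on the remote side, but the remote
-- DELETE is NOT part of the local transaction.
INSERT INTO events_archive
SELECT *
FROM dblink('dbname=livedb',
            $$DELETE FROM events
              WHERE "date" >= '2010-01-01'
                AND "date" <  '2011-01-01'
              RETURNING *$$)
     AS t(id integer, "date" date, payload text);  -- column list must match events
```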

Upvotes: 3

Views: 3149

Answers (1)

tscho

Reputation: 2054

You can dump the data to a file (with SQL COPY or psql's \copy) and delete it in one transaction, then import it into the other database in a second transaction. To do both steps in a single transaction you would need XA (distributed transactions).

Dump from LiveDB:

BEGIN TRANSACTION;
COPY (
  SELECT * FROM events
  WHERE "date" >= '2010-01-01'
  AND   "date" < '2011-01-01'
) TO '/tmp/events.csv' WITH CSV HEADER;

DELETE FROM events
WHERE "date" >= '2010-01-01'
AND   "date" < '2011-01-01';
COMMIT;

Insert into ArchiveDB:

COPY events FROM '/tmp/events.csv' WITH CSV HEADER;
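Note that server-side COPY ... TO/FROM a file requires superuser privileges and writes to the database server's filesystem. The same dump and import can be done client-side with psql's \copy meta-command; the database names below are placeholders, and the DELETE step still needs its own transaction as shown above:

```sql
-- Client-side variant: \copy streams through the psql client,
-- so the CSV file lands on the client machine and no superuser
-- privileges are needed. Run each line in psql against the
-- respective (placeholder-named) database.
\copy (SELECT * FROM events WHERE "date" >= '2010-01-01' AND "date" < '2011-01-01') TO '/tmp/events.csv' CSV HEADER
\copy events FROM '/tmp/events.csv' CSV HEADER
```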

Hope that helps.

Upvotes: 6
