Alex

Reputation: 9418

convert postgres dump insert to batch transaction?

I am searching for a tool or script to convert a multi-gigabyte PostgreSQL dump file into one that commits in batched transactions.

It should insert BEGIN and COMMIT statements into the dump file like so:

BEGIN;
CREATE...;
INSERT...;
INSERT...;
INSERT...;
...
COMMIT; 
BEGIN;
CREATE...;
INSERT...;
INSERT...;
INSERT...;
...
COMMIT;

Yes, anything would be better than INSERT statements. The problem is that the data consists of gigabytes of INSERT statements, and the partner who delivered it cannot be reached to send another dump format.

Do you know a tool to convert/accelerate the import?

Upvotes: 0

Views: 350

Answers (2)

kgrittn

Reputation: 19471

You should consider using the COPY command, the \copy meta-command of psql, or the pg_bulkload utility. Any of these should be significantly faster than the Ruby script you show, and have been well-tested for correct handling of line breaks and other odd characters within the data.
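For illustration (table and file names here are hypothetical, not from the thread), both forms take a single command; the practical difference is which machine reads the file:

```shell
# Server-side COPY: the path is read by the PostgreSQL server process,
# so the file must live on (and be readable from) the database host.
psql -d mydb -c "COPY mytable FROM '/var/lib/postgresql/data.csv' WITH (FORMAT csv)"

# Client-side \copy: the file is read by psql itself, so it works
# without filesystem access on the server.
psql -d mydb -c "\copy mytable FROM 'data.csv' WITH (FORMAT csv)"
```

Either route skips per-row statement parsing entirely, which is where most of the time goes with a dump made of individual INSERTs.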

Upvotes: 1

Alex

Reputation: 9418

Because I couldn't find a tool, I created my own script to do the job:

#!/usr/bin/ruby
# Wraps every 80 statements of the dump in a BEGIN/COMMIT transaction.
BATCH_SIZE = 80
i = 0
File.open(ARGV[1], "a") do |out|
  File.open(ARGV[0]) do |dump|
    batch = []
    # Read one statement at a time, using ";" as the record separator.
    while (stmt = dump.gets(";"))
      batch << stmt
      if batch.length == BATCH_SIZE
        i += 1
        puts "batches written: #{i}" if i % 1000 == 0
        out.write("BEGIN;\n#{batch.join(" ")}\nCOMMIT;\n")
        batch = []
      end
    end
    # Flush the final, partially filled batch.
    out.write("BEGIN;\n#{batch.join(" ")}\nCOMMIT;\n") unless batch.empty?
  end
end

Usage: ruby batchtransact.rb input.sql output.sql
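The batching logic itself can also be sketched as a small pure function (an illustrative rework of the script's core loop, not part of the original): split the SQL text on ";" and wrap each slice of statements in a transaction.

```ruby
# Split SQL text on ";" and wrap every `size` statements in BEGIN/COMMIT.
# Helper name and regex are illustrative; the script above streams from
# a file instead of holding the whole dump in memory.
def batch_transactions(sql, size)
  statements = sql.scan(/[^;]+;/)  # keep the trailing ";" on each statement
  statements.each_slice(size)
            .map { |batch| "BEGIN;\n#{batch.join(" ")}\nCOMMIT;\n" }
            .join
end
```

For example, `batch_transactions("INSERT 1;INSERT 2;INSERT 3;", 2)` groups the first two inserts into one transaction and the third into its own.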

Upvotes: 0
