gabriel.almeida

Reputation: 125

Python psycopg2 - Work with large data

I'm trying to transfer a large amount of data (15 B) from one PostgreSQL database to another with Python/psycopg2, inside Docker. The container has 4 GB of memory and runs out of memory.

What am I doing wrong?

cursor = conn.cursor()
cursor.execute('select * from schema.table')
for row in cursor:
    tp = tuple(map(lambda x: x.encode('utf-8'), row))
    cursor.execute('Insert into table2 values {}'.format(tp))
    conn.commit()

Upvotes: 2

Views: 2038

Answers (1)

Clodoaldo Neto

Reputation: 125284

Use copy_to and copy_from

import psycopg2

# dump the source table to a flat file (text mode; copy_to writes str under Python 3)
f = open('t.txt', 'w')
conn = psycopg2.connect(database='source_db')
cursor = conn.cursor()
cursor.copy_to(f, 'source_table')
conn.close()
f.close()

# load the file into the target table
f = open('t.txt', 'r')
conn = psycopg2.connect(database='target_db')
cursor = conn.cursor()
cursor.copy_from(f, 'target_table')
conn.commit()  # without this, closing the connection rolls the load back
conn.close()
f.close()

Upvotes: 1
