Reputation: 125
I'm trying to transfer a large amount of data (15 B) from one PostgreSQL database to another with python/psycopg2 in Docker. My Docker container has 4 GB of memory and is running out of memory.
What am I doing wrong?
cursor = conn.cursor()
cursor.execute('select * from schema.table')
for row in cursor:
    tp = tuple(map(lambda x: x.encode('utf-8'), row))
    cursor.execute('Insert into table2 values {}'.format(tp))
conn.commit()
Upvotes: 2
Views: 2038
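The memory blow-up in the question is most likely because psycopg2's default cursor is client-side: it fetches the entire result set into RAM before iteration begins. A *named* (server-side) cursor streams rows in batches instead, and a parameterised INSERT avoids the quoting problems of `.format()`. A minimal sketch of that approach (the `stream_copy` helper name, the table names, and the batch size are illustrative, not from the original post):

```python
def stream_copy(src_conn, dst_conn, batch=10000):
    """Copy rows between two open connections without loading the
    whole source table into client memory."""
    # Passing a name makes this a server-side cursor: rows arrive in
    # chunks of `itersize` instead of all at once.
    read_cur = src_conn.cursor(name='transfer_stream')
    read_cur.itersize = batch
    read_cur.execute('SELECT * FROM schema.table')

    write_cur = dst_conn.cursor()
    for row in read_cur:
        # Let the driver do the quoting; assumes the row values are
        # types psycopg2 can adapt.
        placeholders = ','.join(['%s'] * len(row))
        write_cur.execute(
            'INSERT INTO table2 VALUES ({})'.format(placeholders), row)
    dst_conn.commit()
```

Row-by-row INSERTs are still slow for 15 B worth of data; they just stop the client from running out of memory. The COPY-based answer below this is the faster path.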
Reputation: 125284
f = open('t.txt', 'w')
conn = psycopg2.connect(database='source_db')
cursor = conn.cursor()
# COPY the whole table out to a file: the data streams to disk
# instead of being materialised as Python objects
cursor.copy_to(f, 'source_table')
conn.close()
f.close()

f = open('t.txt', 'r')
conn = psycopg2.connect(database='target_db')
cursor = conn.cursor()
# COPY the file into the target table, then commit
cursor.copy_from(f, 'target_table')
conn.commit()
conn.close()
f.close()
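If writing a 15 B intermediate file to disk is unwanted, the two COPY streams can be connected directly through an OS pipe, with `copy_expert` on each side. A sketch under the assumption that both databases are reachable from the same process (the `pipe_copy` helper and table-name parameters are illustrative):

```python
import os
import threading

def pipe_copy(src_conn, dst_conn, src_table, dst_table):
    """Stream COPY output of src_table straight into dst_table
    through an OS pipe, so no temp file and no full in-memory copy."""
    read_fd, write_fd = os.pipe()

    def produce():
        # Writer side: COPY the source table to the pipe.
        with os.fdopen(write_fd, 'wb') as w, src_conn.cursor() as cur:
            cur.copy_expert('COPY {} TO STDOUT'.format(src_table), w)

    writer = threading.Thread(target=produce)
    writer.start()
    # Reader side: COPY from the pipe into the target table.
    with os.fdopen(read_fd, 'rb') as r, dst_conn.cursor() as cur:
        cur.copy_expert('COPY {} FROM STDIN'.format(dst_table), r)
    writer.join()
    dst_conn.commit()
```

The writer runs in a thread because the pipe has a bounded buffer: the producing COPY blocks whenever the consuming COPY falls behind, which is exactly the backpressure that keeps memory use flat.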
Upvotes: 1