Reputation: 69
I have a large database with 10+ million rows of data. I am transferring a few million rows of this data to another database on a different server using a Python script. I am using cursor objects to get and insert the data. What is an efficient way to do this?
What I am doing right now:
sql2 = 'INSERT IGNORE INTO user_session(id, record_time, app_id_for_os, user_id, device_id, action) VALUES(%s, %s, %s, %s, %s, %s)'
ts3 = time.time()
for row in readCur:
    _id = row['id']
    record_time = row['record_time']
    app_id_for_os = row['app_id_for_os']
    user_id = row['user_id']
    device_id = row['device_id']
    action = row['action']
    # one INSERT round trip per source row
    writeCur.execute(sql2, (_id, record_time, app_id_for_os, user_id, device_id, action))
cnx2.commit()
Upvotes: 1
Views: 1717
Reputation: 81
As mentioned above, a MySQL dump is one option. If you still want to use Python, you should use cursor.executemany(operation, seq_of_params); it is a more efficient way to insert a lot of values into a table.
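For example, here is a minimal sketch of how the loop from the question could be rewritten to read and write in batches with fetchmany() and executemany(). The cursor and connection names (readCur, writeCur, cnx2), the SELECT statement, and the batch size are assumptions carried over from the question; readCur is assumed to be a dictionary cursor, as the row['id'] lookups in the question suggest.

batch_size = 10000   # assumed batch size; tune for your row size and memory
sql2 = 'INSERT IGNORE INTO user_session(id, record_time, app_id_for_os, user_id, device_id, action) VALUES(%s, %s, %s, %s, %s, %s)'

# assumed source query; adjust to match the real source table
readCur.execute('SELECT id, record_time, app_id_for_os, user_id, device_id, action FROM user_session')

while True:
    rows = readCur.fetchmany(batch_size)   # pull a chunk of rows instead of one at a time
    if not rows:
        break
    # one executemany() call per chunk is typically much faster than one execute() per row
    writeCur.executemany(sql2, [
        (r['id'], r['record_time'], r['app_id_for_os'],
         r['user_id'], r['device_id'], r['action'])
        for r in rows
    ])
    cnx2.commit()   # commit per batch to keep each transaction a manageable size

Committing once per batch rather than once at the very end also keeps the destination server's transaction log from growing unboundedly while a few million rows are copied.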
Upvotes: 1