Michael Wong

Reputation: 69

What is an efficient way to insert large amounts of data into a MySQL table using Python?

I have a large database with 10+ million rows of data. I am transferring a few million of these rows to another database on a different server using a Python script. I am using cursor objects to fetch and insert the data. What is an efficient way to do this?

What I am doing right now:

sql2 = 'INSERT IGNORE INTO user_session(id, record_time, app_id_for_os, user_id, device_id, action) VALUES(%s, %s, %s, %s, %s, %s)'

ts3 = time.time()
for row in readCur: 
    _id = row['id']
    record_time = row['record_time']
    app_id_for_os = row['app_id_for_os']
    user_id = row['user_id']
    device_id = row['device_id']
    action = row['action']
    writeCur.execute(sql2,(_id, record_time, app_id_for_os, user_id, device_id, action))

cnx2.commit()

Upvotes: 1

Views: 1717

Answers (1)

user5224389

Reputation: 81

As mentioned above, a MySQL dump is one option. If you still want to use Python, you should use cursor.executemany(operation, seq_of_params); it is a more efficient way to insert many rows into a table, since it sends the rows in batches instead of one statement per row.
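A minimal sketch of what that could look like for the question's code, assuming MySQL Connector/Python-style cursors (the `readCur`, `writeCur`, and `cnx2` names are taken from the question; the `batched` helper, `copy_rows` function, and the batch size of 1000 are illustrative, not part of any library):

```python
# Hypothetical sketch: stream rows from the source cursor and insert them
# in batches with executemany(). readCur / writeCur / cnx2 are assumed to
# be the cursors and connection from the question.

def batched(rows, size=1000):
    """Yield lists of at most `size` rows from an iterable of rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

sql2 = ('INSERT IGNORE INTO user_session'
        '(id, record_time, app_id_for_os, user_id, device_id, action) '
        'VALUES (%s, %s, %s, %s, %s, %s)')

def copy_rows(readCur, writeCur, cnx2, batch_size=1000):
    cols = ('id', 'record_time', 'app_id_for_os',
            'user_id', 'device_id', 'action')
    for batch in batched(readCur, batch_size):
        # One executemany() call per batch: far fewer round trips than
        # calling execute() once per row.
        writeCur.executemany(sql2, [tuple(row[c] for c in cols)
                                    for row in batch])
    cnx2.commit()
```

With a dict-returning read cursor this replaces the whole per-row loop from the question; tuning `batch_size` trades memory use against the number of round trips.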

Upvotes: 1
