Reputation: 3
I have the following python script that reads from a file line by line and executes mysql update queries. It is extremely slow, each query seems to take over 1 second. Any idea why it is so slow?
with open(fname) as f:
    for line in f:
        line = line.rstrip()
        email, name = line.split(':')[0], line.split(':')[-1]
        try:
            cursor.execute("UPDATE user SET name=%s WHERE email=%s", (name, email))
        except mariadb.Error as error:
            print("Error: {}".format(error))
Upvotes: 0
Views: 129
Reputation: 2670
If you wanted to be fanatical, you could also turn journaling off (note that these pragmas are SQLite-specific and have no direct MariaDB equivalent):

import sqlite3

db = sqlite3.connect('database.db')
cursor = db.cursor()
cursor.execute("PRAGMA synchronous = OFF")
cursor.execute("PRAGMA journal_mode = OFF")
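The underlying idea, which does carry over to MariaDB, is to stop paying a commit (and its disk flush) per row and instead batch all the updates into one transaction. A minimal sketch of that, using sqlite3 since that is what this answer's snippet uses (the in-memory database and sample row are just for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")  # throwaway DB for the sketch
cursor = db.cursor()
cursor.execute("CREATE TABLE user (email TEXT PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO user VALUES ('a@example.com', 'old')")

# executemany runs the whole batch inside one transaction, and a single
# commit at the end replaces one flush per UPDATE.
rows = [("Alice", "a@example.com")]
cursor.executemany("UPDATE user SET name=? WHERE email=?", rows)
db.commit()
```

With a MariaDB connection the same shape works: set autocommit off (or leave it off, as most connectors default to), call executemany, and commit once.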
Upvotes: 0
Reputation: 3559
You could try using more than one thread.

import threading

def updateUser(name, email):
    try:
        cursor.execute("UPDATE user SET name=%s WHERE email=%s", (name, email))
    except mariadb.Error as error:
        print("Error: {}".format(error))

with open(fname) as f:
    for line in f:
        line = line.rstrip()
        email, name = line.split(':')[0], line.split(':')[-1]
        thread = threading.Thread(target=updateUser, args=[name, email])
        thread.start()

This avoids wasting time waiting on the database: while one query is being processed by the server, your program can be preparing the next one. Be aware, though, that most connectors' cursors are not thread-safe, so in practice each thread should use its own connection, or the updates should be funnelled through a lock or queue.
Upvotes: 0
Reputation: 1269633
You should be able to fix the performance problem by using an index:
create index idx_user_email on user(email);
1 second for an update is a long time.
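To confirm the index is actually being picked up, you can ask the database for its query plan. In MariaDB that would be `EXPLAIN UPDATE user SET name=... WHERE email=...`; the sqlite3 sketch below illustrates the same check in a self-contained way:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE user (email TEXT, name TEXT)")
db.execute("CREATE INDEX idx_user_email ON user(email)")

# EXPLAIN QUERY PLAN reports whether the WHERE clause can use the
# index (a SEARCH) instead of scanning the whole table (a SCAN).
plan = db.execute(
    "EXPLAIN QUERY PLAN UPDATE user SET name=? WHERE email=?",
    ("Alice", "a@example.com"),
).fetchall()
print(plan)
```

If the plan mentions the index by name, each UPDATE becomes an index lookup rather than a full-table scan, which is where the per-query second was going.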
Upvotes: 4