Reputation: 3213
I am doing a lot of inserts into a MySQL table in a short period of time from Python code (using pymysql) that uses a lot of different threads.
Each thread, of which there are many, may or may not end up pushing data to the MySQL table.
Here is the code block that causes the issue (this can be called for each thread running):
sql = ("INSERT INTO LOCATIONS (location_id, place_name) VALUES (%s, %s)")
cursor = self.connection.cursor()
cursor.execute(sql, (location_id, place_name))
cursor.close()
and it is specifically this line:
cursor.execute(sql, (location_id, place_name))
That causes this error:
pymysql.err.InterfaceError: (0, '')
Note also that I define self.connection in the __init__ of the class the above block is in, so all threads share a self.connection object but each gets its own cursor object.
The error seems to happen randomly and only starts appearing (I think) after doing quite a few inserts into the MySQL table. It is NOT consistent, meaning it does not happen with every single attempt to insert into MySQL.
I have googled this specific error and it seems like it could be caused by the cursor being closed before running the query, but I believe it is obvious that I am only closing the cursor after the query has executed.
Right now I think this is happening either because of the threading or because of the shared connection object, but I am not sure.
Thoughts?
Upvotes: 21
Views: 30394
Reputation: 43
For pymysql it's essential to call either the connection.ping() or the cursor.connection.ping() method to be able to keep using the connection; otherwise it closes. ping() re-establishes the connection with the db/cluster and, depending on whether you are trying to connect to an AWS RDS cluster, you might need to pass in the reconnect argument set to a boolean True value:
The ping() goes right before consecutive execute() calls. It doesn't hurt to put it before every execute to make sure you don't fall into the same error.
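A minimal sketch of what that looks like (the connection parameters and the values being inserted are made up here; adapt them to your own setup):

import pymysql

# Made-up connection details; use your own host/user/password/database.
connection = pymysql.connect(host="localhost", user="user",
                             password="secret", database="mydb")

sql = "INSERT INTO LOCATIONS (location_id, place_name) VALUES (%s, %s)"

# Re-establish the connection if the server has dropped it in the meantime;
# reconnect=True tells pymysql to reconnect instead of raising an error.
connection.ping(reconnect=True)

with connection.cursor() as cursor:
    cursor.execute(sql, (42, "Berlin"))
connection.commit()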
Upvotes: 0
Reputation: 393
I got the same problem. There is a global connection in my project code, and I found that this connection times out if there is no MySQL operation for a long time. The error then occurs when executing SQL tasks, because of the timed-out connection.
My solution is to reconnect to MySQL before executing SQL tasks:
sql = ("INSERT INTO LOCATIONS (location_id, place_name) VALUES (%s, %s)")
self.connection.ping() # reconnecting mysql
with self.connection.cursor() as cursor:
cursor.execute(sql, (location_id, place_name))
Upvotes: 13
Reputation: 788
After upgrading to a newer PyMySQL version, I was suddenly getting the same error, but without doing a lot of queries. I was also getting an additional error:
[..]
pymysql.err.InterfaceError: (0, '')
During handling of the above exception, another exception occurred:
pymysql.err.Error: Already closed
Since this appears to be the only real place where this error is being discussed, I'll post my solution here too.
Per the PyMySQL documentation, I was doing this:
connection = pymysql.connect([...])

with connection:
    with connection.cursor() as cursor:
        [..]
        cursor.execute(sql)

# lots of other code

with connection:
    [...]
What I failed to notice is that with connection will automatically close the connection to the database when that context manager finishes executing. So subsequent queries would fail, even though I was still able to get a cursor from the connection without an error.
The solution was to not use the with connection context manager, but to close the database connection manually:
connection = pymysql.connect([...])

with connection.cursor() as cursor:
    [..]
    cursor.execute(sql)

# lots of other code

with connection.cursor() as cursor:
    [..]

connection.close()
Upvotes: 5
Reputation: 111
I got the same error; then I found that PyMySQL connections are not thread-safe, so you need to open a connection for every thread, as @sometimesiwritecode said.
Source found: https://github.com/PyMySQL/PyMySQL/issues/422
Upvotes: 2
Reputation: 11
Don't close the cursor: removing the cursor.close() line should let it continue updating your database.
Upvotes: 1
Reputation: 3213
Seems like the issue was related to me having a universal connection object. Creating one connection per thread seems to have removed this issue.
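For anyone who wants to see the shape of that fix, here is a minimal sketch; the connection parameters and the insert_location() helper are made up for illustration and are not from my actual code:

import threading
import pymysql

def insert_location(location_id, place_name):
    # Each thread opens (and closes) its own connection instead of sharing
    # a single self.connection object across all threads.
    connection = pymysql.connect(host="localhost", user="user",
                                 password="secret", database="mydb")
    try:
        with connection.cursor() as cursor:
            sql = "INSERT INTO LOCATIONS (location_id, place_name) VALUES (%s, %s)"
            cursor.execute(sql, (location_id, place_name))
        connection.commit()
    finally:
        connection.close()

# Each worker thread gets its own connection.
threads = [threading.Thread(target=insert_location, args=(i, "place-%d" % i))
           for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()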
Upvotes: 21