Reputation: 41
I need to run SQL queries in several processes simultaneously. The processes use entirely different databases, so nothing needs to be shared across them. Within each process I create an engine pointing at its own unique database and attempt to connect, but SQLAlchemy always fails to establish a connection in at least one of the processes.
I've pared the code down to just creating an engine and opening a connection in each process. Each engine, and each connection, is created inside its own process and never shared. The databases are also unique, so locking should not be an issue. Nevertheless, one process always fails to connect to its database.
import multiprocessing as mp

import sqlalchemy

def testing(iteration):
    engine = sqlalchemy.create_engine('sqlite:///test_db_{}'.format(iteration),
                                      echo=True)
    print("Created engine for iteration {}".format(iteration))
    with engine.connect() as conn:
        print("Established database connection for iteration {}".format(iteration))

pool = mp.Pool(4)
results = pool.map(testing, range(4))
pool.close()
pool.join()
The output should be as follows (though not necessarily in this exact order):
Created engine for iteration 0
Created engine for iteration 1
Created engine for iteration 2
Created engine for iteration 3
Established database connection for iteration 0
Established database connection for iteration 1
Established database connection for iteration 2
Established database connection for iteration 3
However, I never see "Established database connection for iteration 0" and the program hangs.
Is this a general problem with SQLAlchemy? Is it something specific to SQLite? Or am I missing something entirely? Thank you for your help.
Upvotes: 1
Views: 766
It turns out that the problem only occurs when using SQLite; switching the engine to MySQL eliminates it.
Upvotes: 1