Reputation: 1054
I am trying to create an approach that runs multiple queries from a list at the same time, using the threading library. So far I have this code:
from threading import Thread, Lock
import time

queries = ["SELECT * FROM db1.trans", "SELECT * FROM db1.order", "SELECT * FROM db2.Store",
           "SELECT * FROM db2.Document", "SELECT * FROM db3.Sales"]

class DatabaseWorker(Thread):
    __lock = Lock()

    def __init__(self, query, result_queue):
        Thread.__init__(self)
        self.query = query
        self.result_queue = result_queue

    def run(self):
        result = None
        print("Connecting to database...")
        try:
            # connect(), host and port come from the database driver/configuration (not shown)
            conn = connect(host=host, port=port)
            curs = conn.cursor()
            curs.execute(self.query)
            result = curs.fetchall()  # fetch the rows before closing the cursor
            curs.close()
            conn.close()
        except Exception as e:
            print(str(e))
        self.result_queue.append(result)
delay = 1
result_queue = []
for query in queries:
    worker1 = DatabaseWorker(query, result_queue)
    worker1.start()
    while len(result_queue) < 2:
        time.sleep(delay)
    job_done = True
    worker1.join()
With the above approach the queries run sequentially. I know I could do it this way:
worker1 = DatabaseWorker(queries[0],result_queue)
worker2 = DatabaseWorker(queries[1],result_queue)
...
But I don't think that is the best way. Does anyone know how I can run all the queries from the list dynamically?
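Ideally I would end up with something like this (a rough, untested sketch using the DatabaseWorker class above), where the workers are created and started in a loop and only joined once all of them have been started:

workers = [DatabaseWorker(query, result_queue) for query in queries]
for worker in workers:
    worker.start()
for worker in workers:
    worker.join()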
Thanks!
Upvotes: 1
Views: 550
Reputation: 36
Python threading is not truly parallel because of the GIL (Global Interpreter Lock): only one thread executes Python bytecode at a time. For real parallel execution you can use the multiprocessing module, which spreads the work across separate processes.
Example:
import multiprocessing

def runner(task):
    return f'Hi, I do {task}'

if __name__ == '__main__':
    list_tasks = ['1', '2', '3']
    # Pool() starts one worker process per CPU core by default,
    # and map() distributes the tasks across those processes.
    with multiprocessing.Pool() as pool:
        result = pool.map(runner, list_tasks)
    print(result)
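Applied to your queries, a minimal sketch could look like the following. As in your question, connect(), host and port are placeholders for whatever database driver and connection settings you use (not shown), and each worker process opens its own connection:

import multiprocessing

queries = ["SELECT * FROM db1.trans", "SELECT * FROM db1.order", "SELECT * FROM db2.Store",
           "SELECT * FROM db2.Document", "SELECT * FROM db3.Sales"]

def run_query(query):
    # connect(), host and port are placeholders for your driver and settings (not shown);
    # each worker process opens its own connection.
    conn = connect(host=host, port=port)
    try:
        curs = conn.cursor()
        curs.execute(query)
        return curs.fetchall()  # return plain rows so they can be sent back to the parent process
    finally:
        conn.close()

if __name__ == '__main__':
    with multiprocessing.Pool() as pool:
        results = pool.map(run_query, queries)  # one result list per query, in order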
Upvotes: 1