Reputation: 172
I am currently working on an application where a client makes calls to a web service, a small amount of processing is done on the JSON data returned, and the result is stored in a database. I am currently using Requests and SQLAlchemy. The processing is very small (just reshaping the data into a more relational format). I am not using SQLAlchemy's ORM, just the engine plus transactions.
I was wondering what a good pattern for doing this asynchronously would be (request returns -> data handed off to the database -> the next request starts without waiting for the DB transaction to finish).
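For concreteness, my current synchronous loop looks roughly like this (a sketch; `fetch_json` and `store_row` are hypothetical stand-ins for the Requests call and the SQLAlchemy Core insert). The problem is that the next request cannot start until `store_row` returns:

```python
def fetch_json(url):
    # stand-in for requests.get(url).json()
    return {"url": url, "value": len(url)}

def flatten(data):
    # the small amount of processing: reshape into a relational row
    return (data["url"], data["value"])

def store_row(row):
    # stand-in for an engine.begin() block doing the INSERT
    return row[1]

def run(urls):
    results = []
    for u in urls:
        row = flatten(fetch_json(u))
        results.append(store_row(row))  # blocks before the next fetch
    return results
```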
I know that there are a number of concurrency tools available in Python (multiprocessing, threads, coroutines, asyncore, etc.). However, I am having difficulty finding a good tutorial for my use case.
I was wondering if anyone had suggestions, libraries I should look at, or async patterns that would help me solve this problem.
Thanks.
Upvotes: 1
Views: 1434
Reputation: 1764
You can push each request onto a queue and let a set of worker threads handle them and push the results to the DB.
Here is a simple example of the worker body:
import threading
import time
from Queue import Queue, Empty  # Python 2; in Python 3 this is the 'queue' module
from random import choice

class worker(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.q = Queue()  # each worker has its own request queue

    def run(self):
        while True:
            try:
                r = self.q.get_nowait()
            except Empty:
                r = None
            if r is None:
                # nothing queued; poll again in a second
                time.sleep(1.0)
                continue
            # do something with 'r', e.g. the DB insert
            print '%s: Handled request %s' % (self, r)

    def push(self, r):
        self.q.put(r)

workers = [worker() for i in range(5)]
for w in workers:
    w.start()
Then distribute the requests to the workers like this:
choice(workers).push(req)
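A variation worth noting (my sketch, not part of the answer above): with a single shared queue and a blocking `get()`, idle workers pull the next request themselves, so there is no random assignment and no polling sleep. Python 3 naming (`queue` instead of `Queue`):

```python
import queue
import threading

tasks = queue.Queue()
handled = []
lock = threading.Lock()

def worker():
    while True:
        r = tasks.get()       # blocks until a request arrives
        if r is None:         # sentinel: shut this worker down
            tasks.task_done()
            return
        with lock:
            handled.append(r) # stand-in for the DB write
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(5)]
for t in threads:
    t.start()

for req in range(10):
    tasks.put(req)
tasks.join()                  # wait until every request is handled

for _ in threads:             # one sentinel per worker
    tasks.put(None)
for t in threads:
    t.join()
```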
Upvotes: 1