Reputation: 22282
I have a simple API exposed via tornado. Previously, one of the queries caused an rsync to run. Through trial, error, and exploration, I found that I could fork that off so that it didn't block a timely response:
tornado.process.Subprocess(['rsync', '-vazh', ... ])
I'm evolving this code now so that it no longer runs an external rsync, but instead pokes another service. I'm using Requests to do so:
requests.post('http://other.service/foo/bar')
The network behind this service has really high latencies (the same was true for the rsync process), so I'd still like to fork that work off so that it doesn't delay a timely response. tornado.process.Subprocess seems well suited for calling non-Python shell programs to get work done. Is there an equivalent for doing the same with Python code like the above?
Upvotes: 1
Views: 230
Reputation: 94961
tornado has a built-in HTTP client that you can use in place of requests: tornado.httpclient
from tornado.httpclient import AsyncHTTPClient

def handle_request(response):
    if response.error:
        print("Error:", response.error)
    else:
        print(response.body)

http_client = AsyncHTTPClient()
# tornado requires a body (even an empty one) for POST requests.
http_client.fetch("http://other.service/foo/bar", method='POST', body='',
                  callback=handle_request)
It can also be used as a coroutine, if you want:
from tornado.gen import coroutine
from tornado.httpclient import AsyncHTTPClient

@coroutine
def some_method(self):
    http_client = AsyncHTTPClient()
    # raise_error=False makes fetch() return an error response instead of
    # raising HTTPError, so the check below is reachable.
    response = yield http_client.fetch("http://other.service/foo/bar",
                                       method='POST', body='',
                                       raise_error=False)
    if response.error:
        print("Error:", response.error)
    else:
        print(response.body)
If you're running arbitrary blocking Python code (meaning something that you can't easily find a non-blocking, tornado-compatible replacement for), you should probably consult this question about using multiprocessing with tornado. The short answer is that on Python 3.x you'd probably want to use concurrent.futures.ProcessPoolExecutor to run the blocking code, since a concurrent.futures.Future will integrate properly with the tornado event loop when you yield from one.
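For completeness, a minimal sketch of that approach (this is not from the linked question; the blocking_post helper, the PokeHandler name, and the worker count are made up for illustration, and it assumes a tornado application on Python 3 with requests installed):

from concurrent.futures import ProcessPoolExecutor

import requests
from tornado import gen, web

# One shared pool of worker processes for the whole application.
executor = ProcessPoolExecutor(max_workers=2)

def blocking_post(url):
    # Hypothetical stand-in for the slow, blocking call.
    return requests.post(url).status_code

class PokeHandler(web.RequestHandler):
    @gen.coroutine
    def post(self):
        # executor.submit() returns a concurrent.futures.Future, which a
        # tornado coroutine can yield directly without blocking the IOLoop.
        status = yield executor.submit(blocking_post,
                                       'http://other.service/foo/bar')
        self.write({'status': status})

Because the work runs in a separate process, the function you submit and its arguments have to be picklable, which is why blocking_post is defined at module level.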
Upvotes: 2