Reputation: 151
Scenario:
My server/application needs to handle multiple concurrent requests; for each request the server opens an SSH connection to another machine, runs a long command there, and sends the result back.
1 HTTP request comes in → server opens 1 SSH connection → waits a long time → sends the SSH result back as the HTTP response
This should happen simultaneously for > 200 HTTP and SSH connections in real time.
Solution:
The server really has only one task per request: open an SSH connection for each HTTP request and keep it open until the command finishes. I can't see how to write this asynchronously when there is only that one task to do: SSH. The IOLoop will get blocked for each request. Callbacks and deferred functions don't provide an advantage either, since the SSH task runs for a long time. Threading combined with an event-driven technique sounds like the only way out.
I have been going through tornado examples in Python but none suit my particular need:
- gevent/eventlet
- pseudo multithreading

Environment:
Ubuntu 12.04, high RAM & net speed
Note: I am bound to use Python for coding, and please keep answers limited to my scenario. Opening multiple SSH links while keeping HTTP connections open sounds like asynchronous work, but I want the code to read like a synchronous activity.
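What I have in mind with threading is roughly the following. This is only a rough sketch, assuming a recent Tornado (5+) and Python 3; user@remote-host and long-running-command are placeholders for my real host and command:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

import tornado.ioloop
import tornado.web

# Pool sized for the expected concurrency; each SSH call blocks one worker thread
executor = ThreadPoolExecutor(max_workers=200)

def run_ssh(host, command):
    # Blocking call: runs the ssh client and waits for the remote command to finish
    result = subprocess.run(
        ["ssh", host, command], capture_output=True, text=True
    )
    return result.stdout

class SSHHandler(tornado.web.RequestHandler):
    async def get(self):
        # Hand the blocking SSH work to the thread pool; the IOLoop stays free
        output = await tornado.ioloop.IOLoop.current().run_in_executor(
            executor, run_ssh, "user@remote-host", "long-running-command"
        )
        self.write(output)

if __name__ == "__main__":
    tornado.web.Application([(r"/run", SSHHandler)]).listen(8888)
    tornado.ioloop.IOLoop.current().start()
```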
Upvotes: 1
Views: 374
Reputation: 22134
If by ssh you mean actually running the ssh process, try tornado.process.Subprocess. If you want something more integrated into the server, Twisted has an asynchronous SSH implementation. If you're using something like paramiko, threads are probably your best bet.
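For the Subprocess route, a minimal sketch, written against a current Tornado release with async/await (on the Tornado versions contemporary with Ubuntu 12.04 the equivalent uses gen.coroutine and callbacks); the host and command are placeholders:

```python
import tornado.ioloop
import tornado.web
from tornado.process import Subprocess

class SSHHandler(tornado.web.RequestHandler):
    async def get(self):
        # Spawn the ssh client; STREAM turns stdout into a non-blocking PipeIOStream
        proc = Subprocess(
            ["ssh", "user@remote-host", "long-running-command"],
            stdout=Subprocess.STREAM,
            stderr=Subprocess.STREAM,
        )
        # Read the remote command's output without blocking the IOLoop,
        # so other HTTP requests keep being served while this one waits
        output = await proc.stdout.read_until_close()
        await proc.wait_for_exit(raise_error=False)
        self.write(output)

if __name__ == "__main__":
    tornado.web.Application([(r"/run", SSHHandler)]).listen(8888)
    tornado.ioloop.IOLoop.current().start()
```

Each handler coroutine suspends while its ssh process runs, so hundreds of requests can be in flight on a single IOLoop without threads.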
Upvotes: 0