Reputation: 6327
I'm trying to set up a Python server that handles POST requests. When a request arrives, do_POST starts a new thread, passing it self and some data; the thread then does some work and writes the output back through the handler object it received. This is what I have so far:
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer
....
class httpHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers['content-length'])
        data = self.rfile.read(length)
        Resolver(self, data).start()
        return
Then, in my Resolver class I do:
import threading
class Resolver(threading.Thread):
    def __init__(self, http, number):
        threading.Thread.__init__(self)
        self.http = http
        self.number = number + "!"

    def run(self):
        self.http.send_response(200)
        self.http.send_header('Content-type', 'text/html')
        self.http.send_header('Content-length', len(self.number))
        self.http.end_headers()
        # Send the html message
        self.http.wfile.write(self.number)
        return
Of course, this is just an example, not the complete code; I'm still testing my program. It will be running on a weak platform (at the moment, a Raspberry Pi) and I'm looking for a solution with good performance. Any suggestions?
Upvotes: 2
Views: 5835
Reputation: 366103
The problem is that BaseHTTPRequestHandler expects you to be done with the request by the time you return from do_POST. This isn't all that clear in the documentation, but it's immediately obvious if you look at the source to handle_one_request, the method that calls your method:
mname = 'do_' + self.command
# ...
method = getattr(self, mname)
method()
self.wfile.flush() #actually send the response if not already done.
If you look deeper, you'll see that, as you'd expect, the code expects to be able to close or reuse the connection as soon as it finishes handling a request.
So, you can't use BaseHTTPRequestHandler this way.
You can, of course, write your own handler implementation instead. To a large extent, the stuff in BaseHTTPServer is meant as sample code more than as a powerful, efficient, robust, and flexible framework (which is why the docs link straight to the source).
Alternatively, instead of trying to create a thread per request, just create a thread per connection. The ThreadingMixIn class makes this easy.
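A minimal sketch of that approach, assuming the Python 2 module names and an arbitrary port (neither is from the question): the slow "resolver" work just happens inside do_POST, because every connection already gets its own thread.

from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer
from SocketServer import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    """Spawn a new thread for every incoming connection."""

class httpHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers['content-length'])
        data = self.rfile.read(length)
        result = data + "!"  # the "resolver" work, done inline in this thread
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.send_header('Content-length', str(len(result)))
        self.end_headers()
        self.wfile.write(result)

if __name__ == '__main__':
    # Port 8080 is just an example value.
    ThreadedHTTPServer(('', 8080), httpHandler).serve_forever()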
But an even better solution would be to use a better framework, like Twisted or Tornado, or to use a webserver that does the threading for you and just calls your code via WSGI.
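If you go the WSGI route, a rough, hypothetical equivalent of the handler above could look like the sketch below (the application name and port are placeholders). A threaded or multi-process WSGI server calls it for you, so the application code only ever deals with one request at a time; wsgiref's built-in server is single-threaded and only good for local testing.

def application(environ, start_response):
    # Read the POST body the same way the original handler does.
    length = int(environ.get('CONTENT_LENGTH') or 0)
    data = environ['wsgi.input'].read(length)
    body = data + "!"  # the "resolver" work
    start_response('200 OK', [('Content-Type', 'text/html'),
                              ('Content-Length', str(len(body)))])
    return [body]

if __name__ == '__main__':
    # Quick local test only; a real deployment would use a threading WSGI server.
    from wsgiref.simple_server import make_server
    make_server('', 8000, application).serve_forever()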
Upvotes: 2
Reputation: 23366
This is not the correct way to do this. Each thread you hand a request to is just going to be writing responses through the HTTP server "simultaneously". You could add locking, but that would still basically defeat the purpose.
Python already comes with a simple built-in way to do this. BaseHTTPServer.HTTPServer is a subclass of SocketServer.TCPServer, so you can just use SocketServer.ThreadingMixIn. The Python docs give an example here:
http://docs.python.org/2/library/socketserver.html#asynchronous-mixins
I'm sure there already exist examples of how to do this on SO too.
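For reference, applying the pattern from that page to the httpHandler class in the question takes only a couple of lines (the port number here is made up); do_POST would then build and write the response itself instead of handing self to a Resolver thread.

import BaseHTTPServer
import SocketServer

class ThreadedHTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
    """Handle each connection in its own thread."""

# httpHandler is the request handler class from the question.
ThreadedHTTPServer(('', 8080), httpHandler).serve_forever()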
Upvotes: 1