Reputation: 44749
I have an HTTP server in one program and my basic application in another one. Both of them are loops, so I have no idea how to make them talk to each other.
How are these things usually done? I would really appreciate Python solutions because my scripts are written in Python.
Does a user make an HTTP request which queries the app for some data and returns a result? Yes.
Does the app collect data and store it somewhere? The app and the HTTP server both use an SQLite database. However, the databases may be different.
Upvotes: 2
Views: 6912
Reputation: 37516
Before answering, I think we need some more information:
There are a few options depending on how you're actually using them. Sockets are one option, as is passing information via a file or a database.
[Edit] Based on your reply I think there's a few ways you can do it:
Some more questions:
Depending on how independent the two parts can be, it might be best to write a new app that checks the database of your app for changes (using hooks or polling or whatever) and posts relevant information into the HTTP server's own database. This has the advantage of leaving the two parts less closely coupled, which is often a good thing.
I've got a webserver (Apache 2) which talks to a Django app using the fastcgi module. Have a look at the section on fastcgi in the djangobook. Apache uses sockets (or regular TCP) to talk to the background app (Django).
[Edit 2] Oops - just spotted that your webserver is a Python process itself. If it's all Python then you could launch each in its own thread and pass them both Queue objects, which allow the two threads to send each other information in either a blocking or non-blocking manner.
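A sketch of the Queue approach, with a made-up doubling task standing in for whatever your app actually does (the module is spelled `Queue` on Python 2, `queue` on Python 3):

```python
import threading
import queue  # "Queue" on Python 2

# The background app runs in one thread; the web-server side talks to it
# through a pair of Queue objects. Doubling the item is a stand-in for
# real work.
def app_loop(requests, responses):
    while True:
        item = requests.get()
        if item is None:  # sentinel: shut down
            break
        responses.put(item * 2)

requests = queue.Queue()
responses = queue.Queue()
worker = threading.Thread(target=app_loop, args=(requests, responses))
worker.daemon = True
worker.start()

requests.put(21)
answer = responses.get()  # blocks until the app thread replies
requests.put(None)
worker.join()
```

`Queue.get` and `Queue.put` both take a `block` argument if you'd rather poll than wait.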
Upvotes: 2
Reputation: 193311
When I write web applications in Python, I always keep my web server in the same process as my background tasks. I don't know what web server you're using, but I personally use CherryPy. Your application can have a bunch of its threads be the web server, with however many other threads you like as background tasks. This way you don't need any kind of complex IPC with sockets, named pipes, etc. Instead you simply access shared, global, synchronized data structures to pass along information, and your different modules can directly call each other's functions.
EDIT: To clarify, you can use the threading module to run your CherryPy server in different threads than your other blocking servers. For example:
import cherrypy
from threading import Thread

def listener():
    sock = get_socket_from_somewhere()  # placeholder for your own setup
    while True:
        client, addr = sock.accept()
        # send data back to client, etc

t1 = Thread(target=listener)
t1.daemon = True
t1.start()

cherrypy.quickstart()  # you'd need actual arguments here
This example shows how to have a blocking server in one thread in the same process as a web server (in this case CherryPy, though it could be anything).
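The "shared, global, synchronized data structures" mentioned in this answer can be as simple as a module-level dict guarded by a lock; a minimal sketch (the helper names are made up):

```python
import threading

# Both the web-server threads and the background threads would import
# this module and go through these helpers rather than touching the
# dict directly, so every access holds the lock.
_lock = threading.Lock()
_state = {}

def set_value(key, value):
    with _lock:
        _state[key] = value

def get_value(key, default=None):
    with _lock:
        return _state.get(key, default)
```
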
Upvotes: 0
Reputation: 285077
Well, you can probably just use the subprocess module. For exchanging data, you may be able to use the Popen.stdin and Popen.stdout streams. Of course, there's no limit to the ways you /could/ do it: CORBA, D-Bus, shared memory, DCOP, the list goes on. But try the simple way first, which in this case is regular Python pipes/streams.
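A sketch of the pipe/stream idea, with an inline upper-casing child process standing in for your second script:

```python
import subprocess
import sys

# The child loops over stdin and echoes each line back upper-cased;
# in your case it would be the other program reading/writing its own
# stdin/stdout.
child_code = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write(line.upper())\n"
    "    sys.stdout.flush()\n"
)
proc = subprocess.Popen(
    [sys.executable, "-c", child_code],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
proc.stdin.write("hello\n")
proc.stdin.flush()  # flush so the child sees the line immediately
reply = proc.stdout.readline()
proc.stdin.close()
proc.wait()
```

Remember to flush after each write, or the data may sit in a buffer and both sides will block waiting for each other.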
Upvotes: 3
Reputation: 11792
Depending on what you want to do, you can use os.mkfifo to create a named pipe to share data between your two programs (POSIX only).
http://mail.python.org/pipermail/python-list/2006-August/568346.html
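A sketch of the named-pipe idea; here both ends live in one script as threads, but in your case they'd be the two separate programs opening the same path:

```python
import os
import tempfile
import threading

# POSIX only: create a FIFO at a throwaway path for the demo.
fifo_path = os.path.join(tempfile.mkdtemp(), "demo_fifo")
os.mkfifo(fifo_path)

def writer():
    # Opening a FIFO for writing blocks until a reader opens it too.
    with open(fifo_path, "w") as f:
        f.write("hello from the app\n")

t = threading.Thread(target=writer)
t.start()
with open(fifo_path) as f:  # blocks until the writer connects
    message = f.readline()
t.join()
os.unlink(fifo_path)
```

Note that opening either end of a FIFO blocks until the other end is opened, so each program should open it only when it's ready to talk.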
Upvotes: 1