Reputation: 21
I'm writing a plugin that needs to return quickly to avoid a timeout in the calling program. The caller invokes it at intervals long enough to prepare the data, but preparing the info takes a bit longer than the allowed timeout, so I return the info from a cache file and start a thread that updates the cache file for the next call. The problem is that the main program cannot exit until the thread has finished, which defeats the whole idea. Setting the thread to daemon does not help: daemon mode lets the program return quickly, but the thread just gets killed before it finishes, while non-daemon mode prevents the program from returning quickly until the thread has finished.
Is there a way to exit the program immediately, but still let the thread finish its business?
#!/usr/bin/python
import time
import threading

def getinfofromcachefile():
    print "This is very fast"
    data = { "msg" : "old data" }
    return data

def getfreshinfo():
    time.sleep(5)
    print "This takes a long time"
    time.sleep(10)
    data = { "msg" : "fresh data" }
    return data

def update_cachefile():
    data = getfreshinfo()
    print "The data is now ready"
    print data

def getinfo_fast():
    data = getinfofromcachefile()
    d = threading.Thread( name='update cache', target=update_cachefile )
    d.setDaemon(False)
    d.start()
    return data

print getinfo_fast()
Example output with setDaemon(False):
user@server:/home/ubuntu# time ./snippet
This is very fast
{'msg': 'old data'}
This takes a long time
The data is now ready
{'msg': 'fresh data'}
real 0m15.022s
user 0m0.005s
sys 0m0.005s
Example output with setDaemon(True):
user@server:/home/ubuntu# time ./snippet
This is very fast
{'msg': 'old data'}
real 0m0.010s
user 0m0.000s
sys 0m0.010s
The latter returns fast, but the thread just gets killed off.
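For background on the non-daemon case: at interpreter shutdown, CPython implicitly joins every non-daemon thread that is still running, which is why the script cannot exit before the update is done. A minimal sketch of that effect:

```python
import threading
import time

def slow():
    # stand-in for a lengthy background job
    time.sleep(1)
    print("thread finished")

t = threading.Thread(target=slow)  # threads are non-daemon by default
t.start()
print("end of main script reached")
# the process does not exit here; the interpreter waits for slow() first
```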
Upvotes: 1
Views: 113
Reputation: 21
Arthur's idea of using "more than threads" made me stumble onto python-daemon, which did the trick for me. My program now executes the getinfo_fast function and then immediately returns to the prompt, while getfreshinfo() gets executed in the background.
https://pypi.python.org/pypi/python-daemon
import daemon
#...
print getinfo_fast()
with daemon.DaemonContext():
    getfreshinfo()
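What python-daemon does under the hood is essentially the classic Unix double fork. A minimal, POSIX-only sketch of the same detach idea (run_detached is a hypothetical helper written for illustration, not part of python-daemon):

```python
import os

def run_detached(task):
    """Fork twice so `task` runs in a fully detached process while the
    caller returns immediately."""
    pid = os.fork()
    if pid > 0:
        os.waitpid(pid, 0)  # parent: reap the short-lived first child
        return
    os.setsid()             # first child: new session, no controlling tty
    if os.fork() > 0:       # second fork: grandchild can never regain one
        os._exit(0)
    try:
        task()              # grandchild: do the slow work in the background
    finally:
        os._exit(0)         # never fall back into the caller's code
```

DaemonContext additionally redirects the standard streams, changes the working directory and umask, and so on, so the library is the more robust choice than hand-rolling this.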
Upvotes: 0
Reputation: 1512
It seems to me you need more than threads. If you want your main program to terminate quickly while a long background task keeps running, you need to fork. I can't test right now, but I think you should try:
from multiprocessing import Process
#...
def getinfo_fast():
    data = getinfofromcachefile()
    p = Process(target=update_cachefile)
    p.start()
    # no join, hence the main program terminates
    return data

print getinfo_fast()
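A self-contained sketch of this approach, with stand-ins for the question's functions. One caveat worth testing for the plugin use case: CPython's multiprocessing atexit handler joins non-daemon child processes, so the interpreter itself may still wait at exit for the child to finish, much like a non-daemon thread.

```python
from multiprocessing import Process
import time

def update_cachefile():
    # stand-in for the slow cache refresh from the question
    time.sleep(1)
    print("cache updated")

def getinfo_fast():
    data = {"msg": "old data"}   # stand-in for getinfofromcachefile()
    p = Process(target=update_cachefile)
    p.start()                    # the refresh runs in a separate process
    return data                  # returns at once, without joining

print(getinfo_fast())
# note: at interpreter exit, CPython still joins non-daemon Process
# children, so the process may not terminate until the child is done
```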
Upvotes: 1