Reputation: 3
I am trying to implement a threaded timer to control a timeout for a serial process.
def tst_setMaxTimeFlag():
    lock.acquire()
    maxTimeFlag = 1
    lock.release()
    print "timeout!"
    return

def tst_setMaxTimeTimer(maxResponseTime):
    global responseTimer
    lock.acquire()
    maxTimeFlag = 0
    lock.release()
    responseTimer = threading.Timer(2, tst_setMaxTimeFlag)
    print "timer set!"
    responseTimer.start
    print "timer start!"
    return
I would imagine the output to be:

    timer set!
    timer start!
    timeout!

However, tst_setMaxTimeFlag() is never called and timeout! is never printed.
If I alter responseTimer = threading.Timer(2, tst_setMaxTimeFlag) to responseTimer = threading.Timer(2, tst_setMaxTimeFlag()), the timeout function is called immediately, regardless of the time parameter.
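To illustrate that difference (a minimal sketch; the demo name is hypothetical, not from the original post): threading.Timer expects a callable as its second argument, so writing tst_setMaxTimeFlag() evaluates the call on the spot and hands the timer the function's return value, None.

    import threading

    def demo():
        print "fired!"

    t_good = threading.Timer(2, demo)   # stores the function object; nothing runs yet
    t_bad = threading.Timer(2, demo())  # demo() executes right now and prints "fired!";
                                        # the timer is handed None instead of a callable

Starting t_bad would raise a TypeError when its interval elapses, because None is not callable.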
maxTimeFlag is set as a global in main and initialized to 0.
Any thoughts?
Upvotes: 0
Views: 358
Reputation: 70582
Your code snippet lost all of its indentation as posted, so it's hard to be sure exactly what you did.
The most obvious problem is responseTimer.start. That expression merely retrieves the start method of your responseTimer object; it never calls it. You need to call the method to start the timer, i.e., write responseTimer.start().
Then it will produce the output you expected, with a delay of about 2 seconds before the final "timeout!" is printed.
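For reference, a minimal corrected sketch of the original snippet (assuming lock and maxTimeFlag live at module level, as the question describes; the global declarations for maxTimeFlag are an addition beyond the answer's point, since without them each assignment would only create a local variable):

    import threading

    lock = threading.Lock()
    maxTimeFlag = 0

    def tst_setMaxTimeFlag():
        global maxTimeFlag          # needed so the assignment updates the module-level flag
        lock.acquire()
        maxTimeFlag = 1
        lock.release()
        print "timeout!"

    def tst_setMaxTimeTimer(maxResponseTime):
        global responseTimer, maxTimeFlag
        lock.acquire()
        maxTimeFlag = 0
        lock.release()
        responseTimer = threading.Timer(2, tst_setMaxTimeFlag)
        print "timer set!"
        responseTimer.start()       # the parentheses actually start the timer
        print "timer start!"

    tst_setMaxTimeTimer(2)

Running this prints timer set! and timer start! immediately, then timeout! about 2 seconds later.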
Upvotes: 3