Reputation: 51
Is there a better way to create a timeout for a function in Python?
Following pebble's documentation, I wrote a function where I pass my original function as an argument:
from pebble import ProcessPool
from concurrent.futures import TimeoutError

def timeout(timeout, function, *args):
    # Run `function` in a worker process and give up after `timeout` seconds.
    with ProcessPool() as pool:
        future = pool.schedule(function, args=args, timeout=timeout)
        try:
            result = future.result()
            return result
        except TimeoutError:
            # Re-raise so the caller can handle the timeout itself.
            raise
        finally:
            future.cancel()
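For reference, this is roughly how I call it. my_slow_function and its argument are just placeholders for my real workload, and on Windows the call has to sit under the if __name__ == "__main__": guard so the worker processes can be spawned:

def my_slow_function(n):
    # Placeholder for my real CPU-bound function.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":  # required on Windows for multiprocessing
    print(timeout(5, my_slow_function, 10_000_000))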
It works fine; the only problem is that performance is terrible. To compare: if I run my original function without this "wrapping", it takes 1.5 seconds, while running it through this timeout function takes over 8 seconds.
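Roughly, the comparison looks like this (using the same placeholder function as above):

import time

if __name__ == "__main__":
    start = time.perf_counter()
    my_slow_function(10_000_000)              # direct call
    print("direct:", time.perf_counter() - start)

    start = time.perf_counter()
    timeout(5, my_slow_function, 10_000_000)  # call through the pebble wrapper
    print("wrapped:", time.perf_counter() - start)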
Any proposed solution should take into account that when a timeout happens, the running process must be ended or cancelled rather than left to run indefinitely.
I'm running on Windows, so the signal module is not an option.
Any suggestions?
Upvotes: 1
Views: 1064