Reputation: 5512
I've created a function that uses PyQt5 to "render" HTML and return the result. It's as follows:
def render(source_html):
    """Fully render HTML, JavaScript and all."""
    import sys
    from PyQt5.QtWidgets import QApplication
    from PyQt5.QtWebKitWidgets import QWebPage

    class Render(QWebPage):
        def __init__(self, html):
            self.html = None
            self.app = QApplication(sys.argv)
            QWebPage.__init__(self)
            self.loadFinished.connect(self._loadFinished)
            self.mainFrame().setHtml(html)
            self.app.exec_()

        def _loadFinished(self, result):
            self.html = self.mainFrame().toHtml()
            self.app.quit()

    return Render(source_html).html
Occasionally its threads will hang indefinitely and I'll have to kill the whole program. Unfortunately, PyQt5 may as well be a black box, as I'm not sure how to kill it when it misbehaves.
Ideally I'd be able to implement a timeout of n seconds. As a workaround, I've put the function in its own script render.py and am calling it via subprocess with this monstrosity:
def render(html):
    """Return fully rendered HTML, JavaScript and all."""
    args = ['render.py', '-']
    timeout = 20
    try:
        return subprocess.check_output(args,
                                       input=html,
                                       timeout=timeout,
                                       universal_newlines=True)
    # Python 2's subprocess.check_output doesn't support input or timeout
    except TypeError:
        class SubprocessError(Exception):
            """Base exception from subprocess module."""
            pass

        class TimeoutExpired(SubprocessError):
            """
            This exception is raised when the timeout expires while
            waiting for a child process.
            """
            def __init__(self, cmd, timeout, output=None):
                super(TimeoutExpired, self).__init__()
                self.cmd = cmd
                self.timeout = timeout
                self.output = output

            def __str__(self):
                return ('Command %r timed out after %s seconds' %
                        (self.cmd, self.timeout))

        process = subprocess.Popen(['timeout', str(timeout)] + args,
                                   stderr=subprocess.PIPE,
                                   stdin=subprocess.PIPE,
                                   stdout=subprocess.PIPE)
        # pipe html into render.py's stdin
        output = process.communicate(
            html.encode('utf8'))[0].decode('latin1')
        retcode = process.poll()
        if retcode == 124:
            raise TimeoutExpired(args, timeout)
        return output
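For context, render.py itself isn't shown here; it's essentially a thin stdin/stdout wrapper around the render() function from the first snippet. A minimal sketch of what it might look like (the real script isn't in the question, so treat the details as assumptions):

#!/usr/bin/env python
# render.py -- hypothetical wrapper (not the actual script): the '-' argument
# is taken to mean "read the HTML from stdin", and the rendered result is
# written to stdout, which is what the subprocess call above expects.
import sys

# ... the PyQt5 render(source_html) function from the first snippet goes here ...

if __name__ == '__main__':
    source_html = sys.stdin.read()
    sys.stdout.write(render(source_html))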
The multiprocessing module appears to greatly simplify things:
from multiprocessing import Pool
pool = Pool(1)
rendered_html = pool.apply_async(render, args=(html,)).get(timeout=20)
pool.terminate()
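One wrinkle worth noting: if render() hangs, .get(timeout=20) raises multiprocessing.TimeoutError instead of returning, so in practice you'd wrap the call in a try/except before terminating the pool. A sketch of that (my addition, not from the question):

from multiprocessing import Pool, TimeoutError

pool = Pool(1)
try:
    # run render() in a worker process; give up after 20 seconds
    rendered_html = pool.apply_async(render, args=(html,)).get(timeout=20)
except TimeoutError:
    rendered_html = None  # or re-raise / log, as appropriate
finally:
    # terminate() kills the worker even if it's stuck inside Qt
    pool.terminate()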
Is there a way to implement a timeout that doesn't necessitate this sort of shenanigans?
Upvotes: 0
Views: 1958
Reputation: 3134
I was looking for a solution too; apparently there isn't one, on purpose.
If you're using Linux and all you want is Python to attempt something for N seconds and then time out and handle an error condition after those N seconds, you can do this:
import time
import signal

# This stuff is so when we get SIGALRM from the timeout functionality we can
# handle it instead of crashing to the ground
class TimeOutError(Exception):
    pass

def raise_timeout(signum, frame):
    raise TimeOutError

signal.signal(signal.SIGALRM, raise_timeout)

# Turn the alarm on
signal.alarm(1)

# Try your thing
try:
    time.sleep(2)
except TimeOutError:
    print("We hit our timeout value and we bailed out of whatever that BS was.")

# Remember to turn the alarm back off if your attempt succeeds!
signal.alarm(0)
The one drawback is that you can't nest signal.alarm() hooks; if, in your try statement, you're calling something else that also then sets a signal.alarm(), it will override the first one and screw your stuff up.
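If you want to package that pattern up so the alarm always gets reset (and the previous handler restored), one option is a small context manager around signal.alarm(). This is just a sketch of the same SIGALRM idea under those assumptions; it does not fix the nesting limitation:

import signal
from contextlib import contextmanager

class TimeOutError(Exception):
    pass

@contextmanager
def alarm_timeout(seconds):
    """Raise TimeOutError if the body runs longer than `seconds`."""
    def raise_timeout(signum, frame):
        raise TimeOutError
    old_handler = signal.signal(signal.SIGALRM, raise_timeout)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)                             # always turn the alarm off
        signal.signal(signal.SIGALRM, old_handler)  # restore the previous handler

# Usage:
# with alarm_timeout(20):
#     rendered_html = render(html)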
Upvotes: 1