Joey Blake

Reputation: 3639

Google Appengine URLFetch Timeouts - Any Best Practices?

New to Python and App Engine. I have a little toy app I've been playing with, and I ran into some script timeouts last night. I know you're capped at 10 seconds. What's the best practice for dealing with this?

edit

Sorry, I should have been clearer: the URLFetch timeout is the issue I'm having. By default it is set to 5 seconds, and the max is 10.

Traceback (most recent call last):
  File "/base/python_runtime/python_lib/versions/1/google/appengine/ext/webapp/__init__.py", line 636, in __call__
    handler.post(*groups)
  File "/base/data/home/apps/netlicense/3.349495357411133950/main.py", line 235, in post
    graph.put_wall_post(message=body, attachment=attch, profile_id=self.request.get("fbid"))
  File "/base/data/home/apps/netlicense/3.349495357411133950/facebook.py", line 149, in put_wall_post
    return self.put_object(profile_id, "feed", message=message, **attachment)
  File "/base/data/home/apps/netlicense/3.349495357411133950/facebook.py", line 131, in put_object
    return self.request(parent_object + "/" + connection_name, post_args=data)
  File "/base/data/home/apps/netlicense/3.349495357411133950/facebook.py", line 179, in request
    file = urllib2.urlopen(urlpath, post_data)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 124, in urlopen
    return _opener.open(url, data)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 381, in open
    response = self._open(req, data)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 399, in _open
    '_open', req)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 1115, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/base/python_runtime/python_dist/lib/python2.5/urllib2.py", line 1080, in do_open
    r = h.getresponse()
  File "/base/python_runtime/python_dist/lib/python2.5/httplib.py", line 197, in getresponse
    self._allow_truncated, self._follow_redirects)
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/urlfetch.py", line 260, in fetch
    return rpc.get_result()
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 592, in get_result
    return self.__get_result_hook(self)
  File "/base/python_runtime/python_lib/versions/1/google/appengine/api/urlfetch.py", line 361, in _get_fetch_result
    raise DeadlineExceededError(str(err))
DeadlineExceededError: ApplicationError: 5
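
The DeadlineExceededError above comes from urlfetch's default 5-second deadline. A minimal sketch of raising it to the 10-second maximum by calling urlfetch directly instead of urllib2 (the function name and the Graph API URL here are assumptions, not the actual facebook.py code):

    from google.appengine.api import urlfetch

    def put_wall_post(profile_id, post_data):
        # urllib2 on App Engine is backed by urlfetch and keeps the 5-second
        # default deadline; calling urlfetch.fetch directly lets us pass deadline=10.
        url = "https://graph.facebook.com/%s/feed" % profile_id  # assumed Graph API endpoint
        result = urlfetch.fetch(url,
                                payload=post_data,
                                method=urlfetch.POST,
                                deadline=10)  # 10 seconds is the max for online requests
        return result.content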

Upvotes: 2

Views: 1388

Answers (1)

systempuntoout

Reputation: 74104

You have not told us what your application does, so here are some generic suggestions:

  1. You can trap the timeout exception with the google.appengine.api.urlfetch.DownloadError exception class and gently prompt the user to retry.
  2. The web request runtime limit is 30 seconds; if what you are trying to download is relatively small, you could trap the exception and resubmit the urlfetch (just once) inside the same web request (see the retry sketch after this list).
  3. If doing the work outside the user-facing request is not a problem for your app, you can move the urlfetch call to a worker task served by a Task Queue; one of the advantages of using the taskqueue API is that App Engine automatically retries the task until it succeeds (see the task queue sketch below).
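
A minimal sketch of suggestions 1 and 2, trapping DownloadError and retrying the fetch once within the same request (the helper name is hypothetical; urlfetch's DeadlineExceededError is a subclass of DownloadError, so this also catches the timeout in the question):

    from google.appengine.api import urlfetch

    def fetch_with_one_retry(url, payload=None, method=None):
        method = method or urlfetch.GET
        try:
            return urlfetch.fetch(url, payload=payload, method=method, deadline=10)
        except urlfetch.DownloadError:
            # The first attempt timed out or failed; retry exactly once
            # before surfacing the error to the user.
            return urlfetch.fetch(url, payload=payload, method=method, deadline=10)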
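
And a sketch of suggestion 3, deferring the wall post to a task queue worker so App Engine retries it automatically; the handler names, the /tasks/wall_post URL, and the Graph API endpoint are assumptions for illustration:

    import urllib

    from google.appengine.api import taskqueue  # 'from google.appengine.api.labs import taskqueue' on older SDKs
    from google.appengine.api import urlfetch
    from google.appengine.ext import webapp

    class PostHandler(webapp.RequestHandler):
        def post(self):
            # Enqueue the wall post instead of calling Facebook inline; this
            # returns immediately and the queue handles retries.
            taskqueue.add(url='/tasks/wall_post',
                          params={'fbid': self.request.get('fbid'),
                                  'message': self.request.get('message')})

    class WallPostWorker(webapp.RequestHandler):
        """Mapped to /tasks/wall_post; an unhandled exception or non-2xx
        status makes App Engine retry the task later."""
        def post(self):
            url = "https://graph.facebook.com/%s/feed" % self.request.get('fbid')
            urlfetch.fetch(url,
                           payload=urllib.urlencode({'message': self.request.get('message')}),
                           method=urlfetch.POST,
                           deadline=10)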

Upvotes: 4
