user3769827

Reputation: 407

BigQuery Streaming data with insertAll

We are implementing a Google Cloud solution and have two questions about the BigQuery insertAll streaming API:

  1. Does it have a timeout in case it has to wait for a file to import?
  2. We got this error while testing our streaming code:

Traceback (most recent call last):

  File "/base/data/home/runtimes/python27/python27_lib/versions/1/google/appengine/runtime/wsgi.py", line 266, in Handle
    result = handler(dict(self._environ), self._StartResponse)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1529, in __call__
    rv = self.router.dispatch(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1278, in default_dispatcher
    return route.handler_adapter(request, response)
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 1102, in __call__
    return handler.dispatch()
  File "/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.5.2/webapp2.py", line 570, in dispatch
    return method(*args, **kwargs)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/oauth2client/appengine.py", line 714, in check_oauth
    resp = method(request_handler, *args, **kwargs)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/main.py", line 378, in get
    get_cloud_storage(self, http)
  File "/base/data/home/apps/s~silicon-alpha-636/mytest.378795683110553780/main.py", line 359, in get_cloud_storage
    jsonData = json.dumps(json_row, ensure_ascii = False, sort_keys = True, indent = 4).encode('utf-8')
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/__init__.py", line 250, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 209, in encode
    chunks = list(chunks)
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 434, in _iterencode
    for chunk in _iterencode_dict(o, _current_indent_level):
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 408, in _iterencode_dict
    for chunk in chunks:
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 332, in _iterencode_list
    for chunk in chunks:
  File "/base/data/home/runtimes/python27/python27_dist/lib/python2.7/json/encoder.py", line 409, in _iterencode_dict
    yield chunk
DeadlineExceededError
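For reference, insertAll takes a JSON request body containing the rows to stream. A minimal sketch of building such a body with only the standard library (the row fields and insertId values here are made up for illustration; the actual HTTP call through the Google API client is omitted):

```python
import json

# Hypothetical rows to stream; "insertId" enables BigQuery's
# best-effort de-duplication of retried inserts.
rows = [
    {"insertId": "row-1", "json": {"name": "alice", "score": 10}},
    {"insertId": "row-2", "json": {"name": "bob", "score": 7}},
]

# Request body for tabledata.insertAll.
body = {"kind": "bigquery#tableDataInsertAllRequest", "rows": rows}

# Serialize the same way the traceback above does.
payload = json.dumps(body, ensure_ascii=False, sort_keys=True)
```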

Upvotes: 0

Views: 299

Answers (1)

Pentium10

Reputation: 207830

Currently, there are several errors named DeadlineExceededError for the Python runtime:

google.appengine.runtime.DeadlineExceededError: raised if the overall request times out, typically after 60 seconds, or 10 minutes for task queue requests.

google.appengine.runtime.apiproxy_errors.DeadlineExceededError: raised if an RPC exceeded its deadline. This is typically 5 seconds, but it is settable for some APIs using the 'deadline' option.

google.appengine.api.urlfetch_errors.DeadlineExceededError: raised if the URLFetch times out.

Read more at Dealing with DeadlineExceededErrors
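Since the traceback shows the error surfacing during json.dumps inside an ordinary request handler, it is most likely the first kind: the overall request deadline. A common mitigation is to process rows in batches and stop before the deadline, handing leftover work to the task queue. A minimal, framework-free sketch of that budgeting pattern (the deadline constants and the idea of returning leftovers to re-enqueue are assumptions, not App Engine API):

```python
import time

REQUEST_DEADLINE = 60.0   # App Engine's overall request deadline (seconds)
SAFETY_MARGIN = 10.0      # stop early, leaving time to enqueue the rest

def process_with_budget(rows, process_row, start=None):
    """Process rows until the time budget is nearly spent.

    Returns the rows that were NOT processed; in a real handler those
    would be re-enqueued on the task queue for a later request.
    """
    start = time.time() if start is None else start
    budget = REQUEST_DEADLINE - SAFETY_MARGIN
    for i, row in enumerate(rows):
        if time.time() - start > budget:
            return rows[i:]          # leftovers for the next task
        process_row(row)
    return []                        # everything fit in the budget
```

For work that regularly needs more than 60 seconds, moving the handler onto the task queue (10-minute deadline) is usually the cleaner fix.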

Upvotes: 3
