Reputation: 1156
Querying BigQuery using the Python API sometimes returns "No response" or "No rows", though retrying the same query works fine. What is the reason for this?
def execute(self, bq_query, query_type, uid):
    self.cache = cache.Cache()
    cached_response = self.cache.get(bq_query)
    if cached_response is not None:
        query_response = json.loads(zlib.decompress(cached_response))
        app.logger.info("cached")
    else:
        big_query = _BigQuery(uid)
        query_response = big_query.run_in_big_query(bq_query)
        if not query_response:
            logging.warning('**************************1.No response')
            abort(404)
        elif 'rows' not in query_response.keys():
            logging.warning('**************************2.No rows')
            abort(404)
def run_in_big_query(self, sql):
    start = time.time()
    queryData = {'query': sql}
    queryRequest = self.service.jobs()
    queryResponse = queryRequest.query(projectId=project_id,
                                       body=queryData).execute()
    app.logger.info('Query Time %f' % (time.time() - start))
    return queryResponse
Upvotes: 2
Views: 1177
Reputation: 207912
The query method inserts a query job into BigQuery and blocks waiting for the results, with a default timeout of 10 seconds (timeoutMs). When that timeout is exceeded, the call returns without any results and with the jobComplete flag set to false, which is why you sometimes get a response with no rows while the same query succeeds on a retry.
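For example, you can raise the timeout and check the jobComplete flag before reading rows. A sketch based on your run_in_big_query (the 60s value is arbitrary, and it reuses your existing self.service and project_id):

def run_in_big_query(self, sql):
    query_data = {
        'query': sql,
        'timeoutMs': 60000,  # wait up to 60s instead of the 10s default
    }
    response = self.service.jobs().query(
        projectId=project_id, body=query_data).execute()

    if not response.get('jobComplete', False):
        # The job is still running, so 'rows' is not in the response yet.
        raise RuntimeError('Query did not finish within timeoutMs')
    return response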
Read more at https://cloud.google.com/bigquery/querying-data, which also includes Python sample code showing how to poll until the job is complete.
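A rough polling sketch along those lines (wait_for_query is a hypothetical helper; it assumes the same authorized service client and project_id as in your code):

import time

def wait_for_query(service, project_id, sql, poll_interval=1.0):
    # Start the query; the call blocks for up to timeoutMs.
    response = service.jobs().query(
        projectId=project_id,
        body={'query': sql, 'timeoutMs': 10000}).execute()
    job_id = response['jobReference']['jobId']

    # Keep asking for results until BigQuery reports the job is complete.
    while not response.get('jobComplete', False):
        time.sleep(poll_interval)
        response = service.jobs().getQueryResults(
            projectId=project_id, jobId=job_id, timeoutMs=10000).execute()

    return response.get('rows', [])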
Upvotes: 2