user5488148

Reputation: 11

How to run google-cloud-datalab on my local linux server?

I have registered on the Google Developers Console, but my project is not a billed project. I followed the "initialized environment" and "Build and Run" steps as described on the pages https://github.com/GoogleCloudPlatform/datalab/wiki/Development-Environment and https://github.com/GoogleCloudPlatform/datalab/wiki/Build-and-Run. But when I run code in a notebook deployed on my local Linux server, I run into the following error:

Create and run a SQL query

bq.Query('SELECT * FROM [cloud-datalab-samples:httplogs.logs_20140615] LIMIT 3').results()

Exception                                 Traceback (most recent call last)
 in ()
      1 # Create and run a SQL query
----> 2 bq.Query('SELECT * FROM [cloud-datalab-samples:httplogs.logs_20140615] LIMIT 3').results()

/usr/local/lib/python2.7/dist-packages/gcp/bigquery/_query.pyc in results(self, use_cache)
    130     """
    131     if not use_cache or (self._results is None):
--> 132       self.execute(use_cache=use_cache)
    133     return self._results.results
    134

/usr/local/lib/python2.7/dist-packages/gcp/bigquery/_query.pyc in execute(self, table_name, table_mode, use_cache, priority, allow_large_results)
    343     """
    344     job = self.execute_async(table_name=table_name, table_mode=table_mode, use_cache=use_cache,
--> 345                              priority=priority, allow_large_results=allow_large_results)
    346     self._results = job.wait()
    347     return self._results

/usr/local/lib/python2.7/dist-packages/gcp/bigquery/_query.pyc in execute_async(self, table_name, table_mode, use_cache, priority, allow_large_results)
    307                                          allow_large_results=allow_large_results)
    308     except Exception as e:
--> 309       raise e
    310     if 'jobReference' not in query_result:
    311       raise Exception('Unexpected query response.')
Exception: Failed to send HTTP request.

Stepping through the code, I found the place that throws the exception:

if headers is None:
  headers = {}

headers['user-agent'] = 'GoogleCloudDataLab/1.0'
# Add querystring to the URL if there are any arguments.
if args is not None:
  qs = urllib.urlencode(args)
  url = url + '?' + qs

# Setup method to POST if unspecified, and appropriate request headers
# if there is data to be sent within the request.
if data is not None:
  if method is None:
    method = 'POST'

  if data != '':
    # If there is a content type specified, use it (and the data) as-is.
    # Otherwise, assume JSON, and serialize the data object.
    if 'Content-Type' not in headers:
      data = json.dumps(data)
      headers['Content-Type'] = 'application/json'
  headers['Content-Length'] = str(len(data))
else:
  if method == 'POST':
    headers['Content-Length'] = '0'

# If the method is still unset, i.e. it was unspecified, and there
# was no data to be POSTed, then default to GET request.
if method is None:
  method = 'GET'

# Create an Http object to issue requests. Associate the credentials
# with it if specified to perform authorization.
#
# TODO(nikhilko):
# SSL cert validation seemingly fails, and workarounds are not amenable
# to implementing in library code. So configure the Http object to skip
# doing so, in the interim.
http = httplib2.Http()
http.disable_ssl_certificate_validation = True
if credentials is not None:
  http = credentials.authorize(http)

try:
  response, content = http.request(url, method=method, body=data, headers=headers)
  if 200 <= response.status < 300:
    if raw_response:
      return content
    return json.loads(content)
  else:
    raise RequestException(response.status, content)
except ValueError:
  raise Exception('Failed to process HTTP response.')
except httplib2.HttpLib2Error:
  raise Exception('Failed to send HTTP request.')
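To make the control flow above easier to follow, here is a minimal, standalone sketch of just the request-preparation part of that snippet (querystring, method defaulting, and header setup), ported to Python 3 syntax (`urllib.urlencode` is `urllib.parse.urlencode` there) and without the network call. The function name `build_request` is my own, not part of the Datalab library.

```python
import json
from urllib.parse import urlencode

def build_request(url, args=None, data=None, headers=None, method=None):
  """Sketch of the request-preparation logic from the snippet above.

  Builds the querystring, defaults the HTTP method, and sets the
  Content-Type / Content-Length headers; it does not issue the request.
  """
  if headers is None:
    headers = {}
  headers['user-agent'] = 'GoogleCloudDataLab/1.0'

  # Append a querystring to the URL if there are any arguments.
  if args is not None:
    url = url + '?' + urlencode(args)

  if data is not None:
    # Data implies POST unless a method was given explicitly.
    if method is None:
      method = 'POST'
    if data != '' and 'Content-Type' not in headers:
      # No content type specified: assume JSON and serialize the payload.
      data = json.dumps(data)
      headers['Content-Type'] = 'application/json'
    headers['Content-Length'] = str(len(data))
  elif method == 'POST':
    headers['Content-Length'] = '0'

  # No data and no explicit method: default to a GET request.
  if method is None:
    method = 'GET'
  return url, method, data, headers
```

Note that either branch can end in the `Failed to send HTTP request` exception the asker sees: it is raised whenever `httplib2` cannot complete the request at all (DNS, proxy, or SSL failures), before any BigQuery-level error would be reported.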

I wonder whether this is a configuration error on my side, or whether Cloud Datalab simply does not support being deployed and run locally, that is, whether we cannot run code in notebooks on a local Datalab server at all. Please give me some ideas. This question has been troubling me for a week! Thank you!

Upvotes: 1

Views: 696

Answers (3)

Anthonios Partheniou

Reputation: 1709

Follow the steps in the quickstart guide titled "Run Cloud Datalab locally" to run Datalab locally without setting up a Datalab dev environment.

Upvotes: 0

Dinesh

Reputation: 431

If you are looking to run the Datalab container locally instead of running it in Google Cloud, that is also possible, as described here: https://github.com/GoogleCloudPlatform/datalab/wiki/Build-and-Run

However, that is a developer setup for building and changing Datalab code, and it is not currently geared towards a data scientist or developer looking to use Datalab as a tool. That is, it is more complex to set up than simply deploying Datalab to a billing-enabled cloud project. Even with a locally running container, you will likely want a Google Cloud project in order to run BigQuery queries, etc.

Upvotes: 1

Graham Wheeler

Reputation: 2814

If your project does not have billing enabled you cannot run queries against BigQuery, which is what it looks like you are trying to do.

Upvotes: 0
