Kumar

Reputation: 1017

API.AI to access google BigQuery public datasets

I would like to access google BigQuery public data set from python on Heroku

Example - shakespeare https://bigquery.cloud.google.com/table/bigquery-public-data:samples.shakespeare

Basically, in the example below, I want to access the Shakespeare dataset instead of the Yahoo Weather API:

https://github.com/api-ai/apiai-weather-webhook-sample

Can I access this using only the API_KEY I generate, since it's a public dataset, or do I need OAuth authentication?

https://support.google.com/cloud/answer/6158857?hl=en

The link above states: "If you're calling only APIs that do not require user data, such as the Google Custom Search API, then API keys might be simpler to use than OAuth 2.0 access tokens."

I tried many variations in GAE and ran into issues where the project ID turned out to be null, so I'm going to try Heroku now. I'm wondering whether I need only the API_KEY that I generate in the Google Cloud console, or whether I need OAuth, since Google needs to associate each query with a project for billing purposes. I have already implemented the Yahoo Weather API example on Heroku and it works, but I need to replace the call to Yahoo Weather with a call to the BigQuery public dataset.

Upvotes: 0

Views: 1085

Answers (1)

Graham Polley

Reputation: 14781

If I understand your question correctly, you'd like to know how to programmatically access the BigQuery public datasets using Python, and more specifically how to authenticate.

You need to:

  1. Make sure the BigQuery API is enabled in the console.
  2. Generate a service account in the console.
  3. Download the JSON key.
  4. Export the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the key.
  5. Use the Google Python client lib to query data in BigQuery.
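Steps 4 and 5 can be sketched roughly as below, using the `google-cloud-bigquery` client library (`pip install google-cloud-bigquery`). This assumes you have already exported the key path, e.g. `export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json`; the query itself targets the public Shakespeare sample table from the question.

```python
# The public Shakespeare sample table referenced in the question.
SHAKESPEARE_TABLE = "bigquery-public-data.samples.shakespeare"

# Standard SQL: five most frequent words across the corpus.
QUERY = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 5
"""

def top_words():
    # Imported here so the query text can be inspected without the
    # google-cloud-bigquery package installed.
    from google.cloud import bigquery

    # Client() picks up the service-account credentials from the
    # GOOGLE_APPLICATION_CREDENTIALS environment variable.
    client = bigquery.Client()
    return [(row.word, row.total) for row in client.query(QUERY).result()]

if __name__ == "__main__":
    for word, total in top_words():
        print(word, total)
```

The same pattern works from a Heroku webhook handler: run the query in the request handler and format the result rows into the response payload, just as the weather sample does with the Yahoo API response.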

More info here.

Note: this is the same procedure for accessing your own datasets/tables.

Upvotes: 3

Related Questions