Galat

Reputation: 477

How to run a bigquery SQL query in python jupyter notebook

I am trying to run SQL queries against Google BigQuery from a Jupyter notebook. I followed the steps described here: https://cloud.google.com/bigquery/docs/bigquery-storage-python-pandas#download_query_results_using_the_client_library. I created a service account and downloaded its JSON key file. Now I try to run this script:

from google.cloud import bigquery

bqclient = bigquery.Client('c://folder/client_account.json')

# Download query results.
query_string = """
SELECT * from `project.dataset.table`
"""

dataframe = (
    bqclient.query(query_string)
    .result()
    .to_dataframe(
        # Optionally, explicitly request to use the BigQuery Storage API. As of
        # google-cloud-bigquery version 1.26.0 and above, the BigQuery Storage
        # API is used by default.
        create_bqstorage_client=True,
    )
)
print(dataframe.head())

But I keep getting an error:

DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started

I do not understand what I am doing wrong, because the JSON key file looks fine and the path to it is correct.

Upvotes: 1

Views: 3435

Answers (1)

Sakshi Gatyan

Reputation: 2116

The error means the client library could not find Application Default Credentials. In your snippet, the first positional argument to bigquery.Client() is the project ID, not a key file path, so the path is silently ignored and the library falls back to the (unset) GOOGLE_APPLICATION_CREDENTIALS environment variable.
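One quick fix is to point that environment variable at the key file from inside the notebook, before any client is created. A minimal sketch, assuming "path/to/service_account.json" stands in for your real key file:

```python
import os

# Application Default Credentials reads this variable when the client is built,
# so it must be set before bigquery.Client() is called.
# Assumption: "path/to/service_account.json" is a placeholder for your key file.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/service_account.json"
```

After this, a plain bigquery.Client() (with no arguments) will pick up the credentials automatically.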

Alternatively, to authenticate with the service account explicitly, load the credentials in code and pass them to the client:

from google.cloud import bigquery
from google.oauth2 import service_account


# TODO(developer): Set key_path to the path to the service account key
#                  file.
key_path = "path/to/service_account.json"

credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

bqclient = bigquery.Client(credentials=credentials, project=credentials.project_id)

query_string = """
SELECT * from `project.dataset.table`
"""

dataframe = (
    bqclient.query(query_string)
    .result()
    .to_dataframe(
        # Optionally, explicitly request to use the BigQuery Storage API. As of
        # google-cloud-bigquery version 1.26.0 and above, the BigQuery Storage
        # API is used by default.
        create_bqstorage_client=True,
    )
)
print(dataframe.head())

Upvotes: 1
