Reputation: 417
I am trying to insert a record into BigQuery using my Python code. I always get a "Table not found" error even though the table exists.
from google.cloud import bigquery
from google.oauth2 import service_account
key_path = "service-account.json"
credentials = service_account.Credentials.from_service_account_file(
key_path,
scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
bigquery_client = bigquery.Client(
credentials=credentials,
project=credentials.project_id,
)
dataset_ref = bigquery_client.dataset('mydataset')
table_ref = dataset_ref.table('mytable3')
rows_to_insert = [(u'Adam', 32),(u'Eve', 29)]
errors = bigquery_client.insert_rows(table, rows_to_insert)
assert errors == []
ERROR:
Traceback (most recent call last):
File "./insert.py", line 24, in <module>
bigquery_client.get_table(table_ref)
File "/Users/adam/env/lib/python3.7/site-packages/google/cloud/bigquery/client.py", line 581, in get_table
api_response = self._call_api(retry, method="GET", path=table_ref.path)
File "/Users/adam/env/lib/python3.7/site-packages/google/cloud/bigquery/client.py", line 476, in _call_api
return call()
File "/Users/adam/env/lib/python3.7/site-packages/google/api_core/retry.py", line 277, in retry_wrapped_func
on_error=on_error,
File "/Users/adam/env/lib/python3.7/site-packages/google/api_core/retry.py", line 182, in retry_target
return target()
File "/Users/adam/env/lib/python3.7/site-packages/google/cloud/_http.py", line 393, in api_request
raise exceptions.from_http_response(response)
google.api_core.exceptions.NotFound: 404 GET
https://bigquery.googleapis.com/bigquery/v2/projects/myproject/datasets/mydataset/tables/mytable3: Not found: Table myproject:mydataset.mytable3
I am inserting values at Thu Nov 7 13:23:10 CET 2019 and the table was created at 11:02:48 on 7 Nov 2019, more than two hours earlier. Is there any reason I am getting "table not found", even though the table is visible in both the GUI and the CLI?
Upvotes: 0
Views: 3827
Reputation: 556
From the code you are sharing, I see that the BigQuery API call table = bigquery_client.get_table(table_ref) is missing, so the table variable passed to insert_rows is never defined. You can use the following script to insert values into an already existing table named TABLE created in the dataset DATASET:
from google.cloud import bigquery
from google.oauth2 import service_account

key_path = "./service-account.json"
credentials = service_account.Credentials.from_service_account_file(
    key_path,
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

def insert_to_bigquery(rows_to_insert, dataset_name="DATASET", table_name="TABLE"):
    # Instantiate a client with the service account credentials
    bigquery_client = bigquery.Client(
        credentials=credentials,
        project=credentials.project_id,
    )
    # Prepare references to the dataset and table
    dataset_ref = bigquery_client.dataset(dataset_name)
    table_ref = dataset_ref.table(table_name)
    # API call to fetch the table (and its schema)
    table = bigquery_client.get_table(table_ref)
    # API request to insert the rows
    errors = bigquery_client.insert_rows(table, rows_to_insert)
    assert errors == []

rows_to_insert = [(u'Jason', 32),
                  (u'Paula', 29),
                  (u'Hellen', 55)]

insert_to_bigquery(rows_to_insert)
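As an aside, insert_rows also accepts rows as dictionaries keyed by column name, which makes the mapping to the table schema explicit. A minimal sketch; the column names full_name and age below are assumptions, so substitute your table's actual fields:

```python
# Rows expressed as dictionaries keyed by column name; the names
# "full_name" and "age" are placeholders for your table's real schema.
rows_as_dicts = [
    {"full_name": "Jason", "age": 32},
    {"full_name": "Paula", "age": 29},
    {"full_name": "Hellen", "age": 55},
]

# These can be passed to insert_rows (or a helper wrapping it) in
# place of the tuples above, e.g.:
# insert_to_bigquery(rows_as_dicts)
```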
You can check that the insertion was successful by running the following bq command:
bq query --nouse_legacy_sql 'SELECT * FROM `PROJECT.DATASET.TABLE`'
Upvotes: 1