Reputation: 437
I'm currently trying to upload my dataframe into Google BigQuery, but I keep getting the following error:
RequestException: HTTP request failed: Invalid JSON payload received. Unexpected token.
": 2013, "abca": NaN, "abcd
^
Is this because Google BQ cannot read 'NaN' values?
I have the following code:
# Datalab imports (the code below uses the google.datalab APIs)
from google.datalab import Context
import google.datalab.bigquery as bq
import google.datalab.storage as storage

sample_bucket_name = Context.default().project_id
sample_bucket_path = 'gs://' + sample_bucket_name
sample_bucket_object = sample_bucket_path + '/ABC.txt'
bigquery_dataset_name = 'ABC'
bigquery_table_name = 'ABC'
# Define storage bucket
sample_bucket = storage.Bucket(sample_bucket_name)
# Create storage bucket if it does not exist
if not sample_bucket.exists():
    sample_bucket.create()
# Define BigQuery dataset and table
dataset = bq.Dataset(bigquery_dataset_name)
table = bq.Table(bigquery_dataset_name + '.' + bigquery_table_name)
# Create BigQuery dataset
if not dataset.exists():
    dataset.create()
# Create or overwrite the existing table if it exists
table_schema = bq.Schema.from_data(aas_dataframe)
table.create(schema=table_schema, overwrite=True)
# Write the DataFrame to GCS (Google Cloud Storage)
%storage write --variable simple_dataframe --object $sample_bucket_object
# Write the DataFrame to a BigQuery table
table.insert(aas_dataframe)
Upvotes: 0
Views: 2110
Reputation: 33765
From the JSON spec:
Numeric values that cannot be represented in the grammar below (such as Infinity and NaN) are not permitted.
So no, NaN is not a valid JSON value for a number.
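You can reproduce the failure locally and work around it by converting NaN to None (which serializes as JSON null) before inserting. A minimal sketch, assuming a pandas DataFrame shaped like the one in the error message (the aas_dataframe name and columns here are illustrative):

import json

import pandas as pd

# NaN is a legal float in Python but not valid JSON; a strict
# encoder rejects it, just like BigQuery's JSON parser does:
try:
    json.dumps({"abca": float("nan")}, allow_nan=False)
except ValueError as err:
    print(err)  # "Out of range float values are not JSON compliant"

# Workaround: replace NaN with None so it serializes as JSON null.
aas_dataframe = pd.DataFrame({"year": [2013, 2014], "abca": [float("nan"), 1.5]})
aas_dataframe = aas_dataframe.astype(object).where(pd.notnull(aas_dataframe), None)

# Now the rows serialize cleanly:
print(json.dumps(aas_dataframe.to_dict(orient="records")))

Alternatively, if your schema allows it, df.fillna(0) (or another sentinel value) avoids nulls entirely; which is appropriate depends on your table's column modes.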
Upvotes: 2