Reputation: 195
I am trying to upload a dataframe from Python to BigQuery, and I get the following error:
Object of type date is not JSON serializable
Below is a sample of the dataframe:
product_id   01/05/19   02/05/19   03/05/19
         1     187668     191568     189098
         2     331527     341754     340158
         3      68904      65808      65484
         4      32500      38012      36816
         5      82677      92106      92148
Upvotes: 1
Views: 892
Reputation: 17008
You need to convert the column labels to acceptable column names:
Column names
A column name must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_), and it must start with a letter or underscore. The maximum column name length is 128 characters.
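For illustration (assuming the column labels are datetime.date objects, which the "Object of type date is not JSON serializable" error suggests), stringifying such a label yields a name that starts with a digit and contains dashes, so it breaks both rules:

import datetime

col = datetime.date(2019, 5, 1)
print(str(col))  # '2019-05-01' -- starts with a digit and contains dashes, so not a valid column name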
You can do so by using the table_schema argument of to_gbq:
# Build a schema whose column names are the labels prefixed with '_' and with
# dashes replaced by underscores, e.g. a date label 2019-05-01 becomes '_2019_05_01'.
df.to_gbq('db_name.table_name',
          project_id='xyz',
          if_exists='append',
          verbose=False,
          table_schema=[{'name': '_' + str(col).replace('-', '_'), 'type': 'INT64'}
                        for col in df.columns])
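As an alternative sketch (again assuming the non-key column labels are datetime.date objects; the small example dataframe and the renaming lambda below are my own illustration, not part of the answer above), you can rename the labels on the dataframe itself so they are already valid BigQuery identifiers before calling to_gbq:

import datetime
import pandas as pd

# Rebuild a small dataframe like the one in the question, with datetime.date column labels.
df = pd.DataFrame(
    {datetime.date(2019, 5, 1): [187668, 331527],
     datetime.date(2019, 5, 2): [191568, 341754]},
    index=pd.Index([1, 2], name='product_id'),
).reset_index()

# Rename the date labels to valid BigQuery identifiers, e.g. '_2019_05_01',
# and leave string labels such as 'product_id' untouched.
df = df.rename(columns=lambda col: col if isinstance(col, str)
               else '_' + str(col).replace('-', '_'))

df.to_gbq('db_name.table_name', project_id='xyz', if_exists='append')

With the labels renamed up front, to_gbq can infer the schema itself, so the table_schema argument is no longer required.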
Upvotes: 2