Reputation: 162
I'm trying to upload a CSV file to BigQuery, but I keep getting the following message:
RuntimeError: [{'reason': 'invalid', 'location': 'kid=70943:mkey=customer_encrypt_cns/file-00000000', 'message': 'Error while reading data, error message: CSV table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the error stream for more details.'}, {'reason': 'invalid', 'location': 'kid=70943:mkey=customer_encrypt_cns/file-00000000', 'message': 'Error while reading data, error message: CSV table references column position 1, but line starting at position:0 contains only 1 columns.'}]
The data in the CSV matches the schema, and as a test I'm only uploading 5 rows of 8 columns, e.g.
2018-02-02 x 0.1 2.4 0 0 0.88 0.4
2018-02-03 y 0.1 3 0 0 0.87 0.21
2018-02-04 z 0.1 2.8 0 0 0.86 0.21
2018-02-05 a 0.1 2.4 0 0 0.91 0.21
2018-02-06 b 0.1 1.9 0 0 1.00 0.4
Why does it say I only have 1 column?
Upvotes: 0
Views: 815
Reputation: 393
It says that you have 1 column because your data isn't formatted/parsed as a proper CSV file: there is no clear field delimiter. I formatted the data to be tab-separated and was able to load it into BigQuery, getting the expected 5 rows of 8 columns (a sketch of such a load job is shown below the data).
Here is the data used:
2018-02-06 x 0.1 2.4 0 0 0.88 0.4
2018-02-03 y 0.1 3 0 0 0.87 0.21
2018-02-04 z 0.1 2.8 0 0 0.86 0.21
2018-02-05 a 0.1 2.4 0 0 0.91 0.21
2018-02-06 b 0.1 1.9 0 0 1.00 0.4
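
For reference, a load along these lines can be done with the google-cloud-bigquery Python client by setting field_delimiter="\t". The snippet below is only a minimal sketch: the table ID, file name and column names are placeholders, so adjust them to your own project and schema.

from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.my_table"  # placeholder destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    field_delimiter="\t",  # the important bit: fields are tab-separated
    schema=[  # placeholder column names matching the sample's 8 columns
        bigquery.SchemaField("date", "DATE"),
        bigquery.SchemaField("label", "STRING"),
        bigquery.SchemaField("val1", "FLOAT"),
        bigquery.SchemaField("val2", "FLOAT"),
        bigquery.SchemaField("val3", "FLOAT"),
        bigquery.SchemaField("val4", "FLOAT"),
        bigquery.SchemaField("val5", "FLOAT"),
        bigquery.SchemaField("val6", "FLOAT"),
    ],
)

with open("data.tsv", "rb") as source_file:  # placeholder file name
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)

load_job.result()  # wait for the job to finish; raises on load errors
print(client.get_table(table_id).num_rows, "rows loaded")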
Upvotes: 2
Reputation: 1
Can you provide the request body?
It looks like BigQuery cannot parse the file with the provided options. For your file you should specify the "fieldDelimiter": "\t" option; see the BigQuery API documentation for details.
BigQuery also supports schema auto-detection for comma (,), pipe (|) and tab (\t) separated fields.
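
For illustration, a jobs.insert load configuration along those lines might look like the sketch below (written as a Python dict; the project, dataset, table and Cloud Storage URI are placeholders).

# Rough sketch of a jobs.insert request body for a tab-separated file,
# assuming the file has been staged in Cloud Storage; all names are placeholders.
load_request_body = {
    "configuration": {
        "load": {
            "sourceUris": ["gs://my-bucket/data.tsv"],  # placeholder source file
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "my_dataset",
                "tableId": "my_table",
            },
            "sourceFormat": "CSV",
            "fieldDelimiter": "\t",  # tell BigQuery the fields are tab-separated
            "autodetect": True,  # optional: let BigQuery infer the schema
        }
    }
}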
Upvotes: 0