Reputation: 741
I am loading a series of CSV files into BigQuery.
Everything works nicely if I upload the files to Cloud Storage first and then import them from there using the interactive web interface.
Importing fails, however, when I use bq directly from the command line, reporting invalid timestamps:
.../processed$ ../../scripts/load_all_processed_to_bigquery.sh bfinf_horse_131125to131201.csv
Processing bfinf_horse_131125to131201.csv...
Waiting on bqjob_r5c0ad3f50e8fb78c_00000143ee3c5ccc_1 ... (50s) Current status: DONE
BigQuery error in load operation: Error processing job
'nomadic-freedom-478:bqjob_r5c0ad3f50e8fb78c_00000143ee3c5ccc_1': Too many errors encountered. Limit is: 0.
Failure details:
- File: 0 / Line:1 / Field:2: Could not parse 'SETTLED_DATE' as a
timestamp
.../processed$
The SETTLED_DATE column has no NULL or empty values, and it was prepared specifically in BigQuery's timestamp format: YYYY-MM-DD HH:MM:SS.
Happy to provide additional information if you contact me. I installed bq today, so I presume I am using the latest version. My OS is OS X Mavericks.
Giacecco
Upvotes: 1
Views: 369
Reputation: 741
I've found my mistake: when using bq I forgot to skip the header row with the --skip_leading_rows=1 parameter. That is why the failure is reported at Line:1 / Field:2: BigQuery was trying to parse the literal header string 'SETTLED_DATE' as a timestamp.
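For reference, here is a minimal sketch of the corrected load command. The dataset/table name (racing.bfinf_horse) and the fields other than SETTLED_DATE are just assumptions for illustration; only the flags and the file name come from the question:

# Hypothetical dataset/table and schema, shown for illustration only
bq load --source_format=CSV --skip_leading_rows=1 \
    racing.bfinf_horse \
    bfinf_horse_131125to131201.csv \
    HORSE_ID:STRING,SETTLED_DATE:TIMESTAMP,ODDS:FLOAT

With --skip_leading_rows=1, bq ignores the first line of each file instead of trying to coerce the header text into the schema's types.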
Giacecco
Upvotes: 2