shantanuo

Reputation: 32336

Unexpected error while loading data

I am getting an "Unexpected" error. I tried a few times, and I still could not load the data. Is there any other way to load data?

gs://log_data/r_mini_raw_20120510.txt.gz to 567402616005:myv.may10c
Errors:
Unexpected. Please try again.
Job ID: job_4bde60f1c13743ddabd3be2de9d6b511
Start Time: 1:48pm, 12 May 2012
End Time: 1:51pm, 12 May 2012
Destination Table: 567402616005:myvserv.may10c
Source URI: gs://log_data/r_mini_raw_20120510.txt.gz
Delimiter: ^
Max Bad Records: 30000
Schema:
zoneid: STRING
creativeid: STRING
ip: STRING

update:

I am using the file that can be found here:

http://saraswaticlasses.net/bad.csv.zip

bq load -F '^' --max_bad_records=30000 mycompany.abc bad.csv id:STRING,ceid:STRING,ip:STRING,cb:STRING,country:STRING,telco_name:STRING,date_time:STRING,secondary:STRING,mn:STRING,sf:STRING,uuid:STRING,ua:STRING,brand:STRING,model:STRING,os:STRING,osversion:STRING,sh:STRING,sw:STRING,proxy:STRING,ah:STRING,callback:STRING

I am getting an error "BigQuery error in load operation: Unexpected. Please try again."


The same file works from Ubuntu but does not work from CentOS 5.4 (Final). Does the OS encoding need to be checked?
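One quick check (a sketch; file and md5sum ship with both distributions) is to compare what each OS sees:

$ file bad.csv      # reports detected type and line endings (e.g. "ASCII text, with CRLF line terminators")
$ md5sum bad.csv    # run on both machines; matching hashes mean the bytes are identical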

Upvotes: 0

Views: 1866

Answers (2)

Jordan Tigani

Reputation: 26637

The file you uploaded has an unterminated quote. Can you delete that line and try again? I've filed an internal BigQuery bug so we can handle this case more gracefully.

$ grep '"' bad.csv
3000^0^1.202.218.8^2f1f1491^CN^others^2012-05-02 20:35:00^^^^^"Mozilla/5.0^generic web browser^^^^^^^^
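One way to drop such lines before reloading (a sketch, assuming any line with an odd number of " characters is bad):

$ awk -F'"' 'NF % 2 == 1' bad.csv > bad_clean.csv   # keeps only lines whose double quotes are balanced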

When I run a load from my workstation (Ubuntu), I get a warning about the line in question. Note that if you were using a larger file, you would not see this warning; instead, you'd just get a failure.

$ bq show --format=prettyjson -j job_e1d8636e225a4d5f81becf84019e7484
...
"status": {
  "errors": [
  {
    "location": "Line:29057 / Field:12", 
    "message": "Missing close double quote (\") character: field starts with: <Mozilla/>", 
    "reason": "invalid"
  }
]
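On a larger job you can pull out just the error list from that output (a sketch; assumes the jq JSON tool is installed):

$ bq show --format=prettyjson -j job_e1d8636e225a4d5f81becf84019e7484 | jq '.status.errors'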

Upvotes: 2

Jeremy Condit

Reputation: 7046

My suspicion is that you have rows or fields in your input data that exceed the 64 KB limit. Perhaps re-check the formatting of your data, check that it is gzipped properly, and if all else fails, try importing uncompressed data. (One possibility is that the entire compressed file is being interpreted as a single row/field that exceeds the aforementioned limit.)
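Two quick checks along those lines (a sketch; gzip -t and zcat are standard tools):

$ gzip -t r_mini_raw_20120510.txt.gz   # exits silently if the archive itself is intact
$ zcat r_mini_raw_20120510.txt.gz | awk '{ if (length($0) > max) max = length($0) } END { print "longest row:", max, "bytes" }'

If the longest decompressed row comes out near or above 65536 bytes, the row-size limit is the likely culprit.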

To answer your original question, there are a few other ways to import data: you could upload directly from your local machine using the command-line tool or the web UI, or you could use the raw API. However, all of these mechanisms (including the Google Storage import that you used) funnel through the same CSV parser, so it's possible that they'll all fail in the same way.
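For example, the same load from a local file with the command-line tool (a sketch reusing the question's flags; the schema is abbreviated here) would look like:

$ bq load -F '^' --max_bad_records=30000 mycompany.abc ./bad.csv id:STRING,ceid:STRING,ip:STRING,...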

Upvotes: 1
