user2144301

Reputation: 31

Bigquery Error: 8822097

When trying to load a JSON file to BigQuery, I get the following error: "An internal error occurred and the request could not be completed. Error: 8822097". Is this error related to hitting the BigQuery daily load limit? It would be great if someone could point me to a glossary of errors.

{Location: ""; Message: "An internal error occurred and the request could not be completed. Error: 8822097"; Reason: "internalError"}

Thanks!

Upvotes: 1

Views: 468

Answers (3)

gabrielf

Reputation: 2269

We got the same error, "An internal error occurred and the request could not be completed. Error: 8822097", when running a standard SQL query. Running the corresponding legacy SQL query gave us an error message that was actually actionable:

Error while reading table: ABC, error message: The reference schema differs from the existing data: The required field 'XYZ' is missing.

Fixing the underlying error exposed by the legacy SQL query also fixed the error for the standard SQL query.
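Switching dialects from the CLI is a one-flag change; a minimal sketch, where `ABC` and `XYZ` stand in for your actual table and query (placeholders, not real names):

```shell
# Re-run the failing query in legacy SQL to surface the actionable error;
# legacy SQL uses [project:dataset.table] instead of backticks.
bq query --use_legacy_sql=true 'SELECT XYZ FROM [project:dataset.ABC] LIMIT 10'
```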

In our case we have Avro files, and the table was created from them. Newer Avro files didn't contain a certain field, but the table still did. Rebuilding the table from the new Avro files solved the issue. We also have views on top of the table, which may or may not change the resulting error message.
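Rebuilding the table can be done with a single load job; a sketch with placeholder project, dataset, table, and bucket names:

```shell
# Recreate the table from the current Avro files so the stored schema
# matches the data (--replace overwrites the existing table).
bq load --replace --source_format=AVRO \
  'project:dataset.ABC' 'gs://bucket/path/*.avro'
```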

Upvotes: 0

MonicaPC

Reputation: 374

This error can occur when you hit BigQuery's limit of 10,000 columns per table.

To verify this, you can count the distinct columns in the table in question:

bq --format=json show project:dataset.table | jq . | grep "type" | grep -v "RECORD" | wc -l
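The grep pipeline can be sanity-checked locally without touching BigQuery; a sketch using a small inline schema (in practice you would pipe in the `bq show` output instead):

```shell
# Count leaf (non-RECORD) fields in a BigQuery schema JSON.
# The sample schema has two leaf columns and one RECORD wrapper,
# so the pipeline prints 2.
echo '{"schema":{"fields":[{"name":"a","type":"STRING"},{"name":"r","type":"RECORD","fields":[{"name":"b","type":"INTEGER"}]}]}}' \
  | jq . | grep '"type"' | grep -v 'RECORD' | wc -l
```

Note that RECORD lines are excluded because nested leaf fields are counted by their own `"type"` lines.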

Reducing the number of columns would probably be the best and quickest way to work around this issue.

Upvotes: 0

az3

Reputation: 3629

Are you trying to load different types of files in a single command?

It may happen when you try to load from a Google Storage path with both compressed and uncompressed files:

$ gsutil ls gs://bucket/path/
gs://bucket/path/a.txt
gs://bucket/path/b.txt.gz

$ bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON "project-id:dataset_name.table_name" gs://bucket/path/*
Waiting on bqjob_id_1 ... (0s) Current status: DONE   
BigQuery error in load operation: Error processing job 'project-id:bqjob_id_1': An internal error occurred and the request could not be completed. Error: 8822097
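If this is the cause, loading each kind of file in its own job avoids mixing compressed and uncompressed inputs; a sketch reusing the example paths above (BigQuery load URIs accept a single `*` wildcard):

```shell
# Load the uncompressed and gzipped files as two separate jobs.
bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON \
  "project-id:dataset_name.table_name" 'gs://bucket/path/*.txt'
bq load --autodetect --noreplace --source_format=NEWLINE_DELIMITED_JSON \
  "project-id:dataset_name.table_name" 'gs://bucket/path/*.gz'
```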

Upvotes: 0
