Carp-Bezverhnii Maxim

Reputation: 71

How to solve a problem related to BigQueryError: "reason": "invalid", "location": "test", "debugInfo": "", "message": "no such field."

Has anyone worked with streaming data into (Google) BigQuery from Google Cloud Functions using insert_rows_from_dataframe()?

My problem is that the table schema sometimes is not updated immediately, and when you try to load data into the table right after adding a new field to the schema, it returns an error:

BigQueryError: [{"reason": "invalid", "location": "test", "debugInfo": "", "message": "no such field."}]

However, if I try to load again after a few seconds, it all works fine. So my question is: does anyone know the maximum period of time, in seconds, that this update takes on the BigQuery side, and is it possible to avoid this situation somehow?
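For reference, a minimal sketch of the sequence that triggers the error (the table reference and the DataFrame contents are placeholders; the field name test matches the "location" in the error above):

```python
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("my-project.my_dataset.my_table")  # placeholder table

# Add a new field to the table's schema.
table.schema = list(table.schema) + [bigquery.SchemaField("test", "STRING")]
table = client.update_table(table, ["schema"])

# Streaming rows immediately after the schema change can fail with
# "no such field." until the update propagates on the BigQuery side.
df = pd.DataFrame({"test": ["some value"]})
errors = client.insert_rows_from_dataframe(table, df)
print(errors)  # non-empty error lists when the insert is rejected
```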

Upvotes: 1

Views: 4510

Answers (1)

guillaume blaquiere

Reputation: 75715

Because the API operation on the BigQuery side is not atomic, you can't avoid this case.

You can only mitigate the impact of this behavior: sleep and retry, or wrap insert_rows_from_dataframe() in a try-catch and replay it several times (not infinitely, in case of a real problem, but 5 times for example) until it passes, as in the sketch below.
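A minimal sketch of that retry loop, assuming the google-cloud-bigquery Python client (the 5 attempts and the 2-second sleep are arbitrary example values, not limits documented by BigQuery):

```python
import time
from google.cloud import bigquery

def insert_with_retry(client: bigquery.Client, table, df,
                      max_attempts: int = 5, delay_s: float = 2.0):
    """Replay insert_rows_from_dataframe() until the schema change propagates."""
    last_errors = None
    for attempt in range(max_attempts):
        # insert_rows_from_dataframe returns one error list per chunk;
        # all chunks empty means every row was inserted successfully.
        last_errors = client.insert_rows_from_dataframe(table, df)
        if all(not chunk for chunk in last_errors):
            return
        time.sleep(delay_s)  # give the schema update time to propagate
    # Don't retry forever: surface the errors if it's a real problem.
    raise RuntimeError(
        f"Insert still failing after {max_attempts} attempts: {last_errors}")
```

A fixed sleep is the simplest option; exponential backoff between attempts would be a reasonable refinement.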

Nothing is magic: if consistency is not managed on one side, the other side has to handle it!

Upvotes: 1
