elleFlorio

Reputation: 29

BigQuery "Max retries exceeded" when running dbt

When running dbt we randomly have some models failing with the following error:

HTTPSConnectionPool(host='bigquery.googleapis.com', port=443):
Max retries exceeded with url: /bigquery/v2/projects/xxxx/jobs
(Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f7fdce6dbb0>:
 Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

I tried searching online but could not find anything relating this error to dbt. Could this be an issue internal to dbt, or is the cause something external? Is there a way to prevent it?

We are running dbt targeting BigQuery using a workflow scheduler (Argo) in a GKE cluster.

Thank you! :)

Upvotes: 1

Views: 963

Answers (1)

elleFlorio

Reputation: 29

In the end, the problem was caused by the use of preemptible nodes in GKE. The errors occurred when a dbt run was executing while the kube-dns/kube-proxy services were restarting.

We "solved" the problem by applying a retry logic in Argo in case of failures.

Upvotes: 1
