Luiz

Reputation: 371

Dataflow jobs failing and showing no logs

I created pipelines in Dataflow using the standard JDBC to BigQuery template, and a few jobs are unexpectedly failing without showing any logs.

The thing is, when a job fails because of resources, for example when the job needed more vCPUs than were available in the region or there was not enough memory, these kinds of errors are displayed in the logs, as you can see below.

[Screenshot: resource-related error messages shown in the job's logs]

But some jobs just fail with no logs at all, even though the resources are sufficient.

[Screenshot: a failed job whose logs panel is empty]

Does anyone know how to find the logs in this case?

Upvotes: 3

Views: 1005

Answers (2)

vijaypm

Reputation: 67

The log viewer within the Dataflow console seems to be broken. However, you can still get to the error logs through the Logs Explorer at https://console.cloud.google.com/logs/query;query=resource.type%3D%22dataflow_step%22;
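If the console is misbehaving, you can also run the same query programmatically. Below is a minimal sketch using the google-cloud-logging Python client; "my-project" and "JOB_ID" are placeholders for your own project and the failed job's ID:

```python
# pip install google-cloud-logging
from google.cloud import logging

# "my-project" is a placeholder for your GCP project ID.
client = logging.Client(project="my-project")

# Same filter as the Logs Explorer URL above, narrowed to one job
# and to ERROR-or-worse entries. "JOB_ID" is a placeholder.
log_filter = (
    'resource.type="dataflow_step" '
    'AND resource.labels.job_id="JOB_ID" '
    'AND severity>=ERROR'
)

for entry in client.list_entries(filter_=log_filter):
    print(entry.timestamp, entry.severity, entry.payload)
```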

Upvotes: 1

Israel Herraiz

Reputation: 656

Change the severity of the logs. If you choose Default, you should see more logs. Given how the job page looks for that failed job, you will probably also need to have a look at the worker logs.

Depending on the error, the Diagnostics tab may also have summarized info about what kind of error made the job fail.
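If you would rather pull those worker logs programmatically than through the console, a filter along the lines of the sketch below should work (again assuming the google-cloud-logging Python client; "JOB_ID" is a placeholder). Omitting a severity clause corresponds to choosing Default in the console:

```python
# pip install google-cloud-logging
from google.cloud import logging

client = logging.Client()

# Dataflow worker logs are written under the
# "dataflow.googleapis.com/worker" log. No severity clause here,
# so Default-level messages are included too.
worker_filter = (
    'resource.type="dataflow_step" '
    'AND resource.labels.job_id="JOB_ID" '
    'AND log_id("dataflow.googleapis.com/worker")'
)

for entry in client.list_entries(filter_=worker_filter):
    print(entry.timestamp, entry.severity, entry.payload)
```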

Upvotes: 2
