Grégoire G.

Reputation: 809

Dataflow jobs fail with "Unable to bring up enough workers"; quotas are OK, and changing machine types or regions does not help

We developed an application based on Google Cloud Platform that uses Cloud Dataflow to write data to BigQuery. I am now trying to set up this application in a new GCP project under another organization.

The problem

I am experiencing this issue:

Workflow failed. Causes: Unable to bring up enough workers: minimum 1, actual 0. Please check your quota and retry later, or please try in a different zone/region.

It happens with two Dataflow templates:

1. One takes data from a Pub/Sub topic and writes to another Pub/Sub topic.
2. The other takes data from a Pub/Sub topic and writes to BigQuery.

Jobs are created through the Cloud Dataflow API. The templates are pretty standard, with a maximum of 3 workers and the THROUGHPUT_BASED autoscaling mode.
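For reference, here is roughly how the jobs are launched, as a minimal sketch assuming the templates.launch endpoint of the Dataflow API v1b3 (the project, region, and template path are placeholders). Note that the THROUGHPUT_BASED autoscaling mode is baked into the template when it is built, so only the worker cap appears at launch time:

    # Minimal sketch of the launch call; all names and paths are hypothetical.
    from googleapiclient.discovery import build

    PROJECT_ID = "my-project"
    REGION = "europe-west1"
    TEMPLATE_PATH = "gs://my-bucket/templates/pubsub-to-bigquery"

    dataflow = build("dataflow", "v1b3")
    response = dataflow.projects().locations().templates().launch(
        projectId=PROJECT_ID,
        location=REGION,
        gcsPath=TEMPLATE_PATH,
        body={
            "jobName": "pubsub-to-bigquery-test",
            "environment": {
                "maxWorkers": 3,  # same cap as in our templates
            },
        },
    ).execute()
    print(response["job"]["id"])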

As suggested in similar questions, I checked the Compute Engine quotas, which are far from exceeded. I also changed the region and the machine type; the problem still happens. The Compute Engine and Dataflow APIs are enabled.
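For completeness, this is roughly how I verified the regional quotas through the Compute Engine API (project and region are placeholders); the IN_USE_ADDRESSES metric, notably, covers external IP addresses:

    # Rough sketch of the quota check; project/region are hypothetical.
    from googleapiclient.discovery import build

    compute = build("compute", "v1")
    region = compute.regions().get(project="my-project", region="europe-west1").execute()
    for quota in region["quotas"]:
        if quota["metric"] in ("CPUS", "INSTANCES", "IN_USE_ADDRESSES"):
            print(quota["metric"], quota["usage"], "/", quota["limit"])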

The question

Since it works for projects in another organization, I believe the problem comes from the GCP organization itself, which may have specific restrictions. Is that possible? What other points should I check to make it work?

Upvotes: 1

Views: 977

Answers (1)

Grégoire G.

Reputation: 809

After multiple tests, we managed to make it work properly.

It was indeed not a problem with regions or machine types, even though most of the related Stack Overflow threads suggest starting with those.

It was in fact caused by a restriction on external IP addresses, enforced through a GCP organization policy. As pointed out in this question, the standard configuration of Dataflow requires workers to have external IP addresses.
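If you hit the same restriction, one option is to ask the organization administrator to relax the constraints/compute.vmExternalIpAccess policy for your project. The other is to launch the workers without external IPs at all: the Dataflow API exposes this through the ipConfiguration field of the runtime environment, and the subnetwork then needs Private Google Access enabled so workers can still reach Google APIs. A sketch, assuming the same templates.launch endpoint as above (all names are placeholders):

    # Sketch: launching the same template with private worker IPs only;
    # project, region, bucket, and subnetwork names are hypothetical.
    from googleapiclient.discovery import build

    dataflow = build("dataflow", "v1b3")
    response = dataflow.projects().locations().templates().launch(
        projectId="my-project",
        location="europe-west1",
        gcsPath="gs://my-bucket/templates/pubsub-to-bigquery",
        body={
            "jobName": "pubsub-to-bigquery-private",
            "environment": {
                "maxWorkers": 3,
                # No external IPs on the workers:
                "ipConfiguration": "WORKER_IP_PRIVATE",
                # This subnetwork must have Private Google Access enabled:
                "subnetwork": "regions/europe-west1/subnetworks/my-subnet",
            },
        },
    ).execute()
    print(response["job"]["id"])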

Upvotes: 1
