MffnMn

Reputation: 420

Cancelling jobs without data loss on Dataflow

I'm trying to find a way to gracefully end my jobs that stream from PubSub and write to BigQuery, so that I don't lose any data.

A possible approach I can envision is to have the job stop pulling new data and then keep running until it has processed everything it has already read, but I don't know if or how this can be implemented.

Upvotes: 7

Views: 308

Answers (2)

MffnMn

Reputation: 420

It appears this feature was added in the latest release.

All you have to do now is select the Drain option when stopping a job: the pipeline stops pulling new data and finishes processing what it has already read before shutting down.
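If you want to trigger the drain from a script instead of the console, something along these lines should work against the Dataflow REST API. This is only a rough sketch using the google-api-python-client; the project, region, and job ID values are placeholders, and it assumes Application Default Credentials are set up:

```python
from googleapiclient.discovery import build

# Placeholder values -- substitute your own project, region, and job ID.
PROJECT = "my-project"
REGION = "us-central1"
JOB_ID = "my-streaming-job-id"

# Build a client for the Dataflow REST API (uses Application Default Credentials).
dataflow = build("dataflow", "v1b3")

# Requesting the JOB_STATE_DRAINED state asks the service to stop pulling new
# data and finish processing data already read before terminating the job.
dataflow.projects().locations().jobs().update(
    projectId=PROJECT,
    location=REGION,
    jobId=JOB_ID,
    body={"requestedState": "JOB_STATE_DRAINED"},
).execute()
```

The gcloud CLI exposes the same operation as `gcloud dataflow jobs drain JOB_ID`, if you'd rather not call the API directly.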

Thanks.

Upvotes: 3

Eric Anderson

Reputation: 341

I believe this would be difficult (if not impossible) to do on your own. We (Google Cloud Dataflow team) are aware of this need and are working on addressing it with a new feature in the coming months.

Upvotes: 2
