Jas Kaur

Reputation: 47

Provide custom UUID to spark job through airflow DataprocSubmitJobOperator

I want to give a custom job id to Spark jobs submitted through the Airflow DataprocSubmitJobOperator on Google Cloud.

Through the API we can do that using the --id param; any idea how we can do the same through this operator?

Upvotes: 2

Views: 762

Answers (2)

SorinT

Reputation: 1

I think you should be able to set a custom job id by specifying the task_id in the configuration of DataprocSubmitJobOperator. You can find more about it in the documentation.

Upvotes: 0

Mikayla Konst

Reputation: 111

You can specify the job id in the "reference" field of the job config you pass to the operator:

https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs#jobreference
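A minimal sketch of what that looks like, assuming a PySpark job; the project, cluster, bucket, and job id values below are placeholders, and the operator call is shown in a comment since it needs a running Airflow environment:

```python
# Build the job dict that DataprocSubmitJobOperator accepts via its `job`
# parameter. The custom job id goes in the "reference" field, per the
# JobReference docs linked above. All names here are hypothetical examples.

def build_pyspark_job(project_id, cluster_name, main_uri, custom_job_id):
    """Return a Dataproc job config with a user-supplied job id."""
    return {
        # "reference" carries the custom job id (equivalent to --id in the CLI)
        "reference": {"project_id": project_id, "job_id": custom_job_id},
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {"main_python_file_uri": main_uri},
    }

job = build_pyspark_job(
    "my-project", "my-cluster",
    "gs://my-bucket/job.py", "my-custom-job-id-001",
)

# In a DAG, this dict is then passed to the operator, roughly:
# from airflow.providers.google.cloud.operators.dataproc import (
#     DataprocSubmitJobOperator,
# )
# submit = DataprocSubmitJobOperator(
#     task_id="submit_pyspark",
#     job=job,
#     region="us-central1",
#     project_id="my-project",
# )
print(job["reference"]["job_id"])
```

Note that Dataproc job ids must be unique per project, so if the DAG re-runs you may want to append a run-specific suffix to the id.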

Upvotes: 3
