Reputation: 47
I want to give a custom job ID to the Spark jobs submitted through the Airflow DataprocSubmitJobOperator on Google Cloud.
Through the API we can do that using the --id param; any idea how we can pass the same through this operator?
Upvotes: 2
Views: 762
Reputation: 1
I think you should be able to give a custom job ID by specifying the task_id in the configuration of DataprocSubmitJobOperator; see the operator's documentation for details.
Upvotes: 0
Reputation: 111
You can specify the job ID in the "reference" field of the job config:
https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs#jobreference
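For example, a minimal sketch assuming a Spark job on an existing cluster; the project, region, cluster name, and jar are placeholder values:

```python
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Placeholder values -- substitute your own project, region, and cluster.
SPARK_JOB = {
    # "reference.job_id" carries the custom job ID
    # (the equivalent of --id in the CLI).
    "reference": {
        "project_id": "my-project",
        "job_id": "my-custom-job-id",
    },
    "placement": {"cluster_name": "my-cluster"},
    "spark_job": {
        "main_class": "org.apache.spark.examples.SparkPi",
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
    },
}

submit_job = DataprocSubmitJobOperator(
    task_id="submit_spark_job",
    project_id="my-project",
    region="us-central1",
    job=SPARK_JOB,
)
```

Note that Dataproc job IDs must be unique within a project, so reusing the same job_id for a later run will be rejected; if the task may run more than once, consider templating a unique suffix into the ID.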
Upvotes: 3