Faisal Khan

Reputation: 77

How to pass a custom job ID to a Google Dataproc Spark job using the Dataproc client

I am using the following code snippet but have not had any luck. Can anyone help me pass a custom job ID?

job = {
    "placement": {"cluster_name": cluster_name},
    "spark_job": {
        "main_class": "org.example.App",
        "jar_file_uris": [
            "gs://location.jar",
        ],
        "args": [],
    },
}


operation = job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)

Thanks in advance :)

Upvotes: 1

Views: 550

Answers (1)

Faisal Khan

Reputation: 77

The problem can be solved by adding a "reference" attribute to the job payload, like this:

"reference": {
    "job_id": "test101",
    "project_id": "1553sas207"
}
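Putting it together with the question's snippet, a minimal sketch of the full job payload might look like the following (the cluster name, project ID, bucket, and jar path are placeholders, not values from the original post):

```python
# Dataproc Spark job payload with a custom job ID supplied
# via the "reference" block. All names below are placeholders.
job = {
    "reference": {
        "job_id": "test101",         # custom job ID
        "project_id": "my-project",  # placeholder project ID
    },
    "placement": {"cluster_name": "my-cluster"},
    "spark_job": {
        "main_class": "org.example.App",
        "jar_file_uris": ["gs://my-bucket/app.jar"],
        "args": [],
    },
}
```

This dict is then passed to `job_client.submit_job_as_operation` exactly as in the question; Dataproc should use `reference.job_id` as the job's ID instead of generating one.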

Upvotes: 1
