Sagar Rakshe

Reputation: 2840

Passing multiple system properties to google dataproc cluster job

I am trying to submit a Spark job to a Dataproc cluster. The job needs multiple system properties. I am able to pass just one, as follows:

gcloud dataproc jobs submit spark \
    --cluster <cluster_name> \
    --class <class_name> \
    --properties spark.driver.extraJavaOptions=-Dhost=127.0.0.1  \
    --jars spark_job.jar

How do I pass multiple properties? I tried the following, but even this didn't work:

--properties ^#^spark.driver.extraJavaOptions=-Dhost=127.0.0.1,-Dlimit=10

Upvotes: 7

Views: 2630

Answers (1)

Sagar Rakshe

Reputation: 2840

I figured it out. The trick is to quote the whole `spark.driver.extraJavaOptions` value and separate the `-D` flags with spaces, not commas, so gcloud doesn't split the value at the commas:

gcloud dataproc jobs submit spark \
    --cluster <cluster_name> \
    --class <class_name> \
    --properties spark.driver.extraJavaOptions='-Dhost=127.0.0.1 -Dlimit=10 -Dproperty_name=property_value' \
    --jars spark_job.jar
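For completeness, a minimal sketch of how the driver side can read these flags. `-D` options set JVM system properties, so they are available via `System.getProperty` in the job's main class. The class name `DriverConfig` and the defaults are hypothetical; the property names `host` and `limit` match the flags in the command above.

```java
// Hypothetical driver-side snippet: reads the -D system properties
// passed through spark.driver.extraJavaOptions.
public class DriverConfig {
    public static void main(String[] args) {
        // System.getProperty returns the second argument when the flag
        // was not set, so the snippet also runs standalone.
        String host = System.getProperty("host", "localhost");
        int limit = Integer.parseInt(System.getProperty("limit", "0"));
        System.out.println("host=" + host + " limit=" + limit);
    }
}
```

Note that `spark.driver.extraJavaOptions` only affects the driver JVM; if the executors also need these properties, pass them via `spark.executor.extraJavaOptions` as well.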

Upvotes: 10
