Reputation: 2840
I am trying to submit a Spark job to a Dataproc cluster. The job needs multiple system properties. I am able to pass just one, as follows:
gcloud dataproc jobs submit spark \
--cluster <cluster_name> \
--class <class_name> \
--properties spark.driver.extraJavaOptions=-Dhost=127.0.0.1 \
--jars spark_job.jar
How do I pass multiple properties? I tried the following, but even that didn't work:
--properties ^#^spark.driver.extraJavaOptions=-Dhost=127.0.0.1,-Dlimit=10
Upvotes: 7
Views: 2630
Reputation: 2840
I figured it out. Quoting the whole value lets you pass several space-separated -D flags as a single spark.driver.extraJavaOptions property:
gcloud dataproc jobs submit spark \
--cluster <cluster_name> \
--class <class_name> \
--properties spark.driver.extraJavaOptions='-Dhost=127.0.0.1 -Dlimit=10 -Dproperty_name=property_value' \
--jars spark_job.jar
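The single quotes are what make this work: the shell hands the space-separated -D flags to gcloud as one argument, so the value of spark.driver.extraJavaOptions stays intact and the comma-based parsing of --properties never splits it. A minimal sketch of the quoting behavior (plain shell, no gcloud or Dataproc needed; the property name is just the one from the command above):

```shell
# With quotes, the whole key=value pair is a single shell argument,
# even though the value contains spaces between the -D flags.
set -- --properties spark.driver.extraJavaOptions='-Dhost=127.0.0.1 -Dlimit=10'

# Print one argument per line to show how gcloud would receive them.
printf '%s\n' "$@"
```

Without the quotes, the shell would split the value at the space and gcloud would see `-Dlimit=10` as a separate (invalid) argument.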
Upvotes: 10