Reputation: 140
I am writing a Spark job in Scala and need to pass some arguments through the command line in a JSON file format, such as the application name, the master, and a few more variables. Right now I submit like this:
./bin/spark-submit --name "My app" --master local[4] --conf spark.eventLog.enabled=false --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar
I want to send the app name, the master, and all the other options in one JSON file instead, like:
$SPARK_HOME/bin/spark-submit --properties-file property.conf
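where property.conf would hold something like the following (I am not sure about the exact format or key names, this is just what I have in mind):
spark.app.name                   My app
spark.master                     local[4]
spark.eventLog.enabled           false
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -XX:+PrintGCTimeStamps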
Is that possible? How? Can anyone please explain with a simple example?
Upvotes: 4
Views: 1664
Reputation: 12794
You can use the --jars option as follows:
$SPARK_HOME/bin/spark-submit --jars property.conf --class your.Class your.jar
The help page of spark-submit will tell you more:
$SPARK_HOME/bin/spark-submit --help
--jars JARS Comma-separated list of local jars to include on the driver
and executor classpaths.
Despite the name, you can also use it to ship configuration files that you want to have on your driver's and executors' classpath.
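For example, once property.conf has been shipped that way, you can read it back from the classpath in your driver code. A minimal Scala sketch, assuming the file is a plain key/value properties file named property.conf (adjust the name to whatever you actually ship):
import java.util.Properties

object ConfLoader {
  // property.conf was shipped via --jars, so it is available on the classpath
  def load(): Properties = {
    val props = new Properties()
    val in = getClass.getResourceAsStream("/property.conf")
    require(in != null, "property.conf not found on the classpath")
    try props.load(in) finally in.close()
    props
  }
}
Then something like ConfLoader.load().getProperty("spark.app.name") gives you the value back.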
Upvotes: 3