Sumit Purohit

Reputation: 168

spark.driver.cores setting in spark standalone cluster mode

I am using Spark Standalone cluster mode and use "spark.driver.cores" to set the number of cores for the driver. But the UI always shows it as "0", as in the screenshot below.

Even setting this value in code

    .set("spark.driver.cores","14")

does not seem to work. How should this value be set in standalone cluster mode?

Thanks.

===UPDATE=== Here is the spark-submit command:

    spark-submit --jars "file:///<path to jar>" --master spark://$MASTER:7077 \
        --conf "spark.cores.max=330" \
        --conf "spark.executor.core=5" \
        --conf "spark.sql.broadcastTimeout=10000000" \
        --conf "spark.sql.shuffle.partitions=1000" \
        --conf "spark.default.parallelism=1000" \
        --conf "spark.executor.memory=40g" \
        --conf "spark.driver.memory=40g" \
        --conf "spark.driver.extraJavaOptions=-XX:+UseCompressedOops -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
        --conf "spark.driver.maxResultSize=40g" \
        --conf "spark.network.timeout=10000000" \
        --class "<my class>" "<path to jar>" <other parameters>....

Each node in the cluster has 24 cores and 64GB of memory. I hope this helps. Thanks for your help.

Upvotes: 2

Views: 12371

Answers (1)

Ram Ghadiyaram

Reputation: 29155

`--conf "spark.executor.core=5"` is wrong

It should be (the "s" is missing):

--conf "spark.executor.cores=5" 
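Putting both fixes together, the relevant flags in the question's spark-submit call would look something like this (a sketch; paths and values are placeholders taken from the question, and the remaining --conf flags are unchanged):

```shell
# Corrected flag names: "cores" (plural) for executors,
# and spark.driver.cores passed explicitly at submit time.
spark-submit --jars "file:///<path to jar>" --master spark://$MASTER:7077 \
    --conf "spark.executor.cores=5" \
    --conf "spark.driver.cores=14" \
    --class "<my class>" "<path to jar>" <other parameters>
```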

The second thing is that you are not setting spark.driver.cores in spark-submit.

Your `.set("spark.driver.cores","14")` should look like:

    val sparkConf = new SparkConf()
      .set("spark.driver.cores", "2")
      .setAppName(this.getClass.getSimpleName)
      .setMaster("local[*]")

    val spark: SparkSession = SparkSession.builder()
      .config(sparkConf)
      .appName(this.getClass.getName)
      .master("local[*]")
      .getOrCreate()

Tip: to verify which configurations you are applying, `spark.sparkContext.getConf.getAll.foreach(println)` will print all the configurations applied to create the Spark session.

In the above example :

(spark.app.name,com.examples.DataFrameCSVExample$)
(spark.app.id,local-1558579973832)
(spark.driver.cores,2)
(spark.master,local[*])
(spark.executor.id,driver)
(spark.driver.host,192.168.19.1)
(spark.driver.port,53962)

If you can see the value here, the Spark UI should show the same...
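You can also cross-check from the submit side: spark-submit's --verbose flag prints the parsed Spark properties at launch, so you can confirm the setting made it into the configuration before the application even starts (a sketch reusing the question's placeholders):

```shell
# --verbose makes spark-submit echo the Spark properties it parsed,
# including spark.driver.cores, before launching the application.
spark-submit --verbose --master spark://$MASTER:7077 \
    --conf "spark.driver.cores=14" \
    --class "<my class>" "<path to jar>"
```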

Hope you understood!!!

Upvotes: 4
