tryData

Reputation: 35

Value of 'spark.executor.instances' shown in 'Environment' page

In our application, we have submitted a Spark job with the following configuration values (a PySpark sketch of an equivalent submission follows this list):
'--num-executors' (or) 'spark.executor.instances' - not set
'spark.executor.cores' - not set
'spark.dynamicAllocation.enabled' - 'true'
'spark.executor.memory' - '1g'
(Number of worker nodes available: 3, with 4 vCores each)
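
For reference, a minimal PySpark sketch of an equivalent submission, assuming the configuration is set through a SparkSession builder rather than spark-submit flags; the app name is an assumption:

```python
from pyspark.sql import SparkSession

# Minimal sketch of the submission described above; "example-app" is an assumed name.
spark = (
    SparkSession.builder
    .appName("example-app")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.executor.memory", "1g")
    # spark.executor.instances and spark.executor.cores are deliberately left unset,
    # as in the original submission.
    .getOrCreate()
)
```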

On the 'Environment' page of the Spark Web UI, the following values are observed:
'spark.executor.instances' - '3'
'spark.executor.cores' - '4'

Can we assume that the values shown above for 'spark.executor.instances' (3) and 'spark.executor.cores' (4) are only the initial values?

The reason for this assumption is that, from the 'Executors' page, it can be observed that a total of 14 executors were used.

From the 'Event Timeline', it can be observed that at one moment a maximum of 8 executors were running. Since only 12 cores (3 x 4) are available in total, it looks like the number of cores used per executor will also not stay constant during runtime; i.e., it initially starts at 4 but drops as the number of executors increases!

(Screenshots of the 'Environment' page, 'Executors' page, and 'Event Timeline' were attached.)

Upvotes: 2

Views: 1962

Answers (1)

Jonathan

Reputation: 2043

Your post covers two questions:

  1. Are the initial values of spark.executor.instances and spark.executor.cores 3 and 4 respectively? It depends on which mode you are using. Based on the configuration you provided, where you set spark.dynamicAllocation.enabled to True, and given that you have 3 nodes with 4 cores each, Spark will scale the number of executors based on the workload. Also, if you're running your Spark application on YARN, the default value of spark.executor.cores is 1. Since you didn't mention your mode or how many Spark applications run at the same time, I assume you're running a single Spark job and not running in YARN mode. You can check each Spark configuration option here: https://spark.apache.org/docs/latest/configuration.html

  2. Will the number of cores and executors differ from what you configured in spark-submit as your number of executors increases? No; once you have submitted your Spark application and created the SparkContext, the configured number of executors and cores will not change unless you create a new one (see the sketch below for reading these values back from a running SparkContext).
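As a rough illustration of the second point, the configuration values resolved at submission time can be read back from the live SparkContext and stay fixed even while dynamic allocation changes the number of running executors; the fallback strings printed for unset keys are assumptions for readability:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
conf = spark.sparkContext.getConf()

# These entries hold whatever was resolved when the SparkContext was created;
# dynamic allocation varies the live executor count, not these config values.
print(conf.get("spark.executor.instances", "not set"))
print(conf.get("spark.executor.cores", "not set"))
print(conf.get("spark.dynamicAllocation.enabled", "false"))
```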

Upvotes: 1
