Mahsa

Reputation: 1550

How to change the number of cores used in code?

I have a standalone Spark program whose running time I want to measure with different numbers of cores. Whatever I try, I keep getting the same runtime values!

This is the part of the code where I set the number of cores before creating the Spark context:

System.setProperty("spark.cores.max","96")

The total number of cores I have is 252.

Upvotes: 1

Views: 882

Answers (2)

Dici

Reputation: 25950

To complement langkilde's answer: you can use the spark.cores.max property and set it on the SparkConf (not in the system properties...) or pass it as a parameter to spark-submit. By the way, you can also read the docs: https://spark.apache.org/docs/1.2.0/configuration.html.

Not all aspects of Spark are well documented, but configuration definitely is.
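For instance, a minimal sketch of both approaches (assuming a standalone cluster, where spark.cores.max caps the total cores an application may claim; it has no effect in local mode):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Set spark.cores.max on the SparkConf itself, not via System.setProperty:
val conf = new SparkConf()
  .setAppName("app")
  .set("spark.cores.max", "96") // upper bound on total cores for this app
val sc = new SparkContext(conf)
```

Or equivalently on the command line, where it overrides anything hard-coded:

spark-submit --conf spark.cores.max=96 --class Main app.jar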

Upvotes: 0

langkilde

Reputation: 1513

One suggestion is to try setting it with SparkConf().setMaster("local[numCores]"). For example, this sets it to 4 cores:

val conf = new SparkConf().setAppName("app").setMaster("local[4]")
val sc = new SparkContext(conf)

See here for details: https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkConf

Upvotes: 2
