Reputation: 3064
SparkContext has a getExecutorMemoryStatus method, but that reports the memory status of each executor. Is there any way to get the core status? I am using a Spark Standalone cluster.
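For context, a minimal sketch (assuming an active SparkContext named sc) of what getExecutorMemoryStatus exposes:

// getExecutorMemoryStatus maps each executor's "host:port" address to
// (maximum memory, remaining memory) in bytes; it says nothing about cores.
val memStatus = sc.getExecutorMemoryStatus
memStatus.foreach { case (address, (maxMem, remainingMem)) =>
  println(s"$address: max=$maxMem bytes, remaining=$remainingMem bytes")
}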
Upvotes: 4
Views: 3332
Reputation: 29155
Option 2: Defaults:

sc.defaultParallelism

It is typically set to the number of worker cores in your cluster.
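A minimal sketch of checking it (the master URL and app name here are illustrative; point setMaster at your own standalone master):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("core-count").setMaster("spark://master:7077")
val sc = new SparkContext(conf)
// On a standalone cluster this equals the total cores across executors,
// unless spark.default.parallelism is set explicitly.
println(s"defaultParallelism = ${sc.defaultParallelism}")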
Option 3: You can use ExecutorInfo.totalCores, as shown below; it should work.
The docs say:

public class ExecutorInfo extends Object

Stores information about an executor to pass from the scheduler to SparkListeners.
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

/**
 * Logs info of added executors.
 */
final class ExecutorLogger extends SparkListener {
  // Fired once per executor as it registers; totalCores is that executor's core count.
  // The leading \r keeps the line from mixing into Spark's console progress bar.
  override def onExecutorAdded(executorAdded: SparkListenerExecutorAdded): Unit =
    println(s"\rExecutor ${executorAdded.executorId} added: ${executorAdded.executorInfo.executorHost} ${executorAdded.executorInfo.totalCores} cores")
}
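To receive the events, register the listener on the SparkContext before the executors come up; a usage sketch:

// Register the listener so onExecutorAdded fires for each executor that joins.
sc.addSparkListener(new ExecutorLogger)

Alternatively, Spark can load the listener at startup via the spark.extraListeners configuration property.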
Upvotes: 0