Reputation: 1089
How can I get the executor ID when running PySpark code? I am aware that with Scala I can use SparkEnv.get().executorId(), but I cannot find the equivalent when using PySpark.
Upvotes: 10
Views: 3388
Reputation: 703
The Spark UI will give you access to the executor IDs as well as their individual performance metrics.
Upvotes: -1
Reputation: 182
You can use the REST API to query the executors; I have used it in pySparkUtils to find the executor IPs.
Boaz
Upvotes: 3
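The REST API route above can be sketched as follows. This assumes the Spark monitoring REST API of a running application, served by default at `http://<driver>:4040`; the sample response below is illustrative, not captured output:

```python
# Hedged sketch: pull executor IDs from Spark's monitoring REST API.
# Endpoint paths follow the documented API:
#   /api/v1/applications/[app-id]/executors
import json
from urllib.request import urlopen

def executors_endpoint(base_url: str, app_id: str) -> str:
    """REST endpoint listing all executors of one application."""
    return f"{base_url}/api/v1/applications/{app_id}/executors"

def executor_ids(executors: list) -> list:
    """Extract the 'id' field from the JSON list the endpoint returns."""
    return [e["id"] for e in executors]

def fetch_executor_ids(base_url: str, app_id: str) -> list:
    """Query a live application (requires a running Spark UI)."""
    with urlopen(executors_endpoint(base_url, app_id)) as resp:
        return executor_ids(json.load(resp))

# Illustrative response shape; the driver appears as executor "driver",
# and hostPort can be split to recover executor IPs as well.
sample = [{"id": "driver", "hostPort": "10.0.0.1:40001"},
          {"id": "0", "hostPort": "10.0.0.2:40002"}]
print(executor_ids(sample))  # ['driver', '0']
```

Because the driver is listed alongside the executors, filter out the `"driver"` entry if you only want worker executors.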