salvador

Reputation: 1089

pySpark: Get executor id

How can I get the executor id when running pySpark code? I am aware that in Scala I can use SparkEnv.get().executorId(), but I cannot find the equivalent in pySpark.
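For reference, here is what I can reach from the driver through the py4j gateway (a sketch, not an official pySpark API); it only returns "driver" there, which is why I am looking for an equivalent that works on the executors:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    sc = spark.sparkContext

    # Driver-side only: call the JVM's SparkEnv through the py4j gateway.
    # Inside executor tasks sc._jvm is not available.
    executor_id = sc._jvm.org.apache.spark.SparkEnv.get().executorId()
    print(executor_id)  # prints "driver" when run on the driver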

Upvotes: 10

Views: 3388

Answers (2)

Jack_The_Ripper

Reputation: 703

The Spark UI will give you access to the executor IDs as well as their individual performance metrics.

Upvotes: -1

Boaz Mohar

Reputation: 182

You can use the REST API to query the executors. I have used it in pySparkUtils to find the executor IPs.
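As a rough sketch (assuming the driver UI is reachable at sc.uiWebUrl, available in Spark 2.1+; this is not taken from the pySparkUtils source), the monitoring endpoint /api/v1/applications/[app-id]/executors returns one entry per executor, including its id and host:port:

    import requests
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    sc = spark.sparkContext

    # Query the Spark monitoring REST API for this application's executors.
    url = "{0}/api/v1/applications/{1}/executors".format(sc.uiWebUrl, sc.applicationId)
    executors = requests.get(url).json()

    for e in executors:
        # The driver shows up as an entry with id "driver" alongside the executors.
        print(e["id"], e["hostPort"])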

Boaz

Upvotes: 3
