Nikita Poberezkin

Reputation: 71

How can I programmatically find Spark version in an executor node?

I'm trying to write a method (which will be run on an executor) that returns the Spark version as a string. I know that I can find the Spark version with the following code:

SparkSession.builder().getOrCreate().version (even on an executor)
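
For reference, a minimal driver-side sketch of that approach (the variable names and the version value in the comment are mine, just for illustration):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()
// On the driver this returns the running Spark version, e.g. "3.5.0".
val version: String = spark.version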

But when I run tests (tests in the Apache Spark source code that were written before mine), some of them fail with the following error:

Caused by: java.lang.IllegalStateException: SparkSession should only be created and accessed on the driver.

So I understand that I can't use SparkSession there. My question is: is there any other way to find the Spark version on an executor?

Upvotes: 0

Views: 1228

Answers (1)

Nikita Poberezkin

Reputation: 71

I solved my problem by importing SPARK_VERSION directly:

import org.apache.spark.SPARK_VERSION
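
A minimal sketch of how this can be used from code that runs on executors; the method and the RDD usage below are illustrative, not from the original post:

import org.apache.spark.SPARK_VERSION

// SPARK_VERSION is a constant in the org.apache.spark package object,
// so reading it does not require a SparkSession and is safe inside
// task closures that execute on executors.
def versionOnExecutor(): String = SPARK_VERSION

// Illustrative usage, assuming an existing SparkContext named sc:
// val versions = sc.parallelize(1 to 4, 4).map(_ => versionOnExecutor()).distinct().collect()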

There was also the option of passing the version along in the configuration object that was already in my class.
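
A hedged sketch of what that alternative could look like: read the version on the driver and let the plain string travel to the executors with the task closure (all names below are made up for illustration):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()  // driver side
val driverVersion: String = spark.version         // plain, serializable String

// The string is captured by the closure and shipped to executors with each task.
val versionsSeenOnExecutors = spark.sparkContext
  .parallelize(1 to 4, 4)
  .map(_ => driverVersion)
  .distinct()
  .collect()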

Upvotes: 2
