Matthias Seiler

Reputation: 43

SparkSession initialization throws ExceptionInInitializerError

I'm trying to run a simple Spark Structured Streaming job, but I get an error when calling getOrCreate() on SparkSession...

I create the SparkSession like this:

SparkSession spark = SparkSession
                .builder()
                .appName("CountryCount")
                .master("local[*]")
                .getOrCreate();

Using this pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <artifactId>spark-streaming</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <spark.version>3.0.0</spark.version>
        <mvn-shade.version>3.2.4</mvn-shade.version>
        <slf4j.version>1.7.30</slf4j.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>${mvn-shade.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${mvn-shade.version}</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

However, I get the following exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:442)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at JobCountryCount.createJob(JobCountryCount.java:43)
    at JobCountryCount.<init>(JobCountryCount.java:27)
    at JobCountryCount.main(JobCountryCount.java:21)
Caused by: java.lang.NullPointerException
    at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    ... 14 more

Thank you in advance!

Upvotes: 3

Views: 1705

Answers (1)

zsxwing

Reputation: 20836

Looks like the version of the Apache Commons Lang 3 (commons-lang3) library on your classpath is older than 3.8, which doesn't support JDK 11. See https://issues.apache.org/jira/browse/LANG-1384.
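
If the old copy is being dragged in transitively by one of your own Maven dependencies, one thing you could try (just a sketch under that assumption, not a guaranteed fix) is declaring commons-lang3 explicitly so Maven resolves the newer version:

<!-- Hypothetical fix sketch: force the commons-lang3 version that Spark 3.0.0
     ships with. This only helps if the stale copy is pulled in through your
     Maven classpath, not if it comes from an external Spark/Hadoop install. -->
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.9</version>
</dependency>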

Since Apache Spark 3.0.0 uses commons-lang3 3.9, my hunch is that your environment may also have an old Spark (or Hadoop) version on the classpath. You can print classOf[org.apache.commons.lang3.SystemUtils].getResource("SystemUtils.class") in your code; it will tell you where the class is loaded from.
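
For a Java job like yours, the same check would look something like this (a minimal sketch; the class name is just a placeholder):

// Hypothetical class name, purely for illustration.
public class WhereIsCommonsLang {
    public static void main(String[] args) {
        // Prints a URL such as
        // jar:file:/.../commons-lang3-3.9.jar!/org/apache/commons/lang3/SystemUtils.class,
        // i.e. the JAR the JVM actually loaded SystemUtils from.
        System.out.println(
                org.apache.commons.lang3.SystemUtils.class
                        .getResource("SystemUtils.class"));
    }
}

Run that with the exact same classpath as the failing job; if the URL points at a JAR older than commons-lang3 3.8, you have found the culprit.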

Upvotes: 2
