Mariusz Kowalewski

Reputation: 161

Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$

Hi, I'm trying to run Spark on my local laptop. I created a Maven project in IntelliJ IDEA, and my main class contains the single line below. When I try to run the project, I get the following error:

 val spark = SparkSession.builder().master("local").getOrCreate()

21/11/02 18:02:35 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x34e9fd99) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x34e9fd99
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
    at Main$.main(Main.scala:8)
    at Main.main(Main.scala)
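For context, a minimal sketch of a main object around that line (the object name and import are assumptions; only the SparkSession line comes from my project):

 import org.apache.spark.sql.SparkSession

 object Main {
   def main(args: Array[String]): Unit = {
     // Building the session starts the driver's BlockManagerMasterEndpoint,
     // which is where the IllegalAccessError is thrown on Java 17
     val spark = SparkSession.builder().master("local").getOrCreate()
     spark.stop()
   }
 }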

My dependencies in the pom:

 <dependencies>
     <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
     <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-core_2.13</artifactId>
         <version>3.2.0</version>
     </dependency>

     <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
     <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-sql_2.13</artifactId>
         <version>3.2.0</version>
     </dependency>

     <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
     <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-hive_2.11</artifactId>
         <version>2.1.3</version>
         <scope>provided</scope>
     </dependency>
 </dependencies>

Any idea how to resolve this problem?

Upvotes: 15

Views: 25053

Answers (2)

Marreddy

Reputation: 31

This worked by running the following command from the command prompt on Windows 11, with the dependent Spark JARs loaded from the target path:

java  --add-exports java.base/sun.nio.ch=ALL-UNNAMED  -cp "target/<JAR FILE>;target/libs/*" <FULL CLASS PATH>

java  --add-exports java.base/sun.nio.ch=ALL-UNNAMED  -cp "target/LifeTimeReports.jar;target/libs/*" TestGFS


java -version

java version "17.0.7" 2023-04-18 LTS

Java(TM) SE Runtime Environment (build 17.0.7+8-LTS-224)

The project was built in NetBeans with Maven.
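Since the project is built with Maven, the same flag can also be wired into the build instead of a hand-written java command. A sketch (the surefire plugin version is an assumption; the flag is the same one used in the command above) that applies it to test runs:

 <build>
     <plugins>
         <plugin>
             <groupId>org.apache.maven.plugins</groupId>
             <artifactId>maven-surefire-plugin</artifactId>
             <version>3.0.0-M7</version>
             <configuration>
                 <!-- export sun.nio.ch so Spark's StorageUtils can reach sun.nio.ch.DirectBuffer on Java 17 -->
                 <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
             </configuration>
         </plugin>
     </plugins>
 </build>

When running the main class directly from an IDE (IntelliJ IDEA or NetBeans), the equivalent is adding --add-exports java.base/sun.nio.ch=ALL-UNNAMED to the run configuration's VM options.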

Upvotes: 2

Vzzarr

Reputation: 5700

At the time of writing this answer, Spark does not support Java 17 - only Java 8/11 (source: https://spark.apache.org/docs/latest/).

In my case, uninstalling Java 17 and installing Java 8 (e.g. OpenJDK 8) fixed the problem, and I was able to start using Spark on my laptop.

UPDATE:

Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.7+ and R 3.5+.
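Java 17 support arrived with Spark 3.3.0, so instead of downgrading Java you can also bump the Spark dependencies. A sketch assuming the Scala 2.13 artifacts from the question (3.3.0 chosen as the first release listing Java 17 support); note that when launching from an IDE rather than via spark-submit, the --add-exports flag from the other answer may still be needed, since spark-submit adds the required JVM module options itself:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.13</artifactId>
     <version>3.3.0</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_2.13</artifactId>
     <version>3.3.0</version>
 </dependency>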

Upvotes: 12
