offby1

Reputation: 6993

Why am I seeing `java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible` on a Mac?

I'd been building code at work happily for days when suddenly one of my projects (not all of them) started failing with this error. See below for the answer!

Upvotes: 6

Views: 11364

Answers (2)

offby1

Reputation: 6993

How I fixed it

At first I googled and saw that lots of people with this problem were using Java 16. But I thought (incorrectly!) that I was using Java 11, because:

:-) 2021-09-01T10:32:59-0700 $ java --version
openjdk 11.0.12 2021-07-20
OpenJDK Runtime Environment Temurin-11.0.12+7 (build 11.0.12+7)
OpenJDK 64-Bit Server VM Temurin-11.0.12+7 (build 11.0.12+7, mixed mode)

But I then thought to check further:

mvn --debug package
Apache Maven 3.8.1 (05c21c65bdfed0f71a2f2ada8b84da59348c4c5d)
Maven home: /usr/local/Cellar/maven/3.8.1/libexec
Java version: 16.0.1, vendor: Homebrew, runtime: /usr/local/Cellar/openjdk/16.0.1/libexec/openjdk.jdk/Contents/Home

"Aha!" said I.

So I fixed it by:

  • running `brew uninstall maven`
  • going to the Maven download page and following the installation instructions there.
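Concretely, the two steps above look roughly like this — a sketch only; the version number, download URL, and install prefix are illustrative, so take the current ones from the Maven download page:

```shell
# Remove the Homebrew Maven, which pulls in the Homebrew openjdk (16)
brew uninstall maven

# Illustrative manual install; adjust version and destination to taste
curl -fsSLO https://archive.apache.org/dist/maven/maven-3/3.8.2/binaries/apache-maven-3.8.2-bin.tar.gz
tar xzf apache-maven-3.8.2-bin.tar.gz -C "$HOME/tools"
export PATH="$HOME/tools/apache-maven-3.8.2/bin:$PATH"
```

Afterwards, `mvn --version` should report the JDK you intended rather than Homebrew's JDK 16.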

Why did it work before?

The troublesome project only recently had some tests added, and those tests happen to trigger some reflection (sorry for being vague; I'm new to Java :-) ):

SparkSession.builder().appName("ANewUnitTest").master("local").getOrCreate()
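That one-line Spark builder ends up doing deep reflection on JDK internals. Here is a minimal sketch of the failing step with no Spark required — the class and constructor come straight from the error message, and the outcome depends on your JDK (denied by default since JDK 16 under JEP 396, permitted with a warning on 9–15):

```java
import java.lang.reflect.Constructor;

public class DirectBufferAccessDemo {
    public static void main(String[] args) throws Exception {
        // The class named in the error message; Spark reaches it
        // reflectively when setting up off-heap memory.
        Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
        Constructor<?>[] ctors = cls.getDeclaredConstructors();
        Constructor<?> ctor = ctors[0];
        for (Constructor<?> c : ctors) {
            // DirectByteBuffer(long, int) — exact signature varies by JDK
            if (c.getParameterCount() == 2) { ctor = c; break; }
        }
        try {
            // JEP 396: java.nio is not open to unnamed modules on JDK 16+,
            // so this throws InaccessibleObjectException there.
            ctor.setAccessible(true);
            System.out.println("accessible");
        } catch (RuntimeException e) {
            System.out.println(e.getClass().getSimpleName());
        }
    }
}
```

If you must stay on Java 16+, a commonly cited workaround is to open the package explicitly with JVM flags such as `--add-opens=java.base/java.nio=ALL-UNNAMED`; switching Maven back to Java 11, as above, avoids the need.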

Upvotes: 5

jgp

Reputation: 2091

You can try keeping the mvn from brew but specifying which version of Java it should use via the JAVA_HOME environment variable:

% export JAVA_HOME=/usr/local/opt/openjdk@8
% mvn --debug
Apache Maven 3.8.2 (ea98e05a04480131370aa0c110b8c54cf726c06f)
Maven home: /usr/local/Cellar/maven/3.8.2/libexec
Java version: 1.8.0_302, vendor: Oracle Corporation, runtime: /usr/local/Cellar/openjdk@8/1.8.0+302/libexec/openjdk.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "11.5.2", arch: "x86_64", family: "mac"
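If you'd rather not export JAVA_HOME in every shell, the mvn launcher script on Unix sources `~/.mavenrc`, so the pin can be made persistent for Maven alone (the JDK path below is illustrative, matching a Homebrew openjdk@11 install):

```shell
# Persist the JDK choice for Maven only; other tools still see the
# default java. Path is illustrative — point it at your actual JDK.
echo 'export JAVA_HOME=/usr/local/opt/openjdk@11' >> ~/.mavenrc
```

Subsequent `mvn` runs pick this up without touching your shell profile.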

You should be good with Spark on either Java 8 or Java 11. I don't know yet about more recent versions; see https://spark.apache.org/docs/latest/.

Upvotes: 2
