Daniele Foffano

Reputation: 31

Apache-Spark error on python : java.lang.reflect.InaccessibleObjectException

It's my first time using Apache Spark with Python (PySpark), and I was trying to run the Quick Start examples, but when I run the line:

>>> textFile = spark.read.text("README.md")

it gives me the following error (I'm pasting just the first part, because I think it's the most important):

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/daniele/Scaricati/spark/python/pyspark/sql/readwriter.py", line 311, in text
    return self._df(self._jreader.text(self._spark._sc._jvm.PythonUtils.toSeq(paths)))
  File "/home/daniele/Scaricati/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/home/daniele/Scaricati/spark/python/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/home/daniele/Scaricati/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.text.
: java.lang.reflect.InaccessibleObjectException: Unable to make field private transient java.lang.String java.net.URI.scheme accessible: module java.base does not "opens java.net" to unnamed module @779d0812
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:335)

Can someone help me solve this? Sorry if my post isn't very clear; it's my first one on this forum. Thanks to everyone who tries to help, Daniele.

Upvotes: 3

Views: 6583

Answers (1)

Sohum Sachdev

Reputation: 1397

The issue is that your Spark version and your Java version are incompatible. To resolve this, do the following:

  1. Check your PySpark version:

    pyspark

  2. Check which Java version is required for your PySpark version (e.g. PySpark 2.4.6 requires Java 8 - https://spark.apache.org/docs/2.4.6/)

  3. Check which Java versions are installed:

    /usr/libexec/java_home -V

  4. If the required Java version is not installed, install it (e.g. brew install adoptopenjdk8 on macOS)

  5. Change your JAVA_HOME to point to the correct version, for example:

    export JAVA_HOME="/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home"

  6. Confirm the active version:

    java -version
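If you want to check step 6 programmatically, the version test can be sketched in Python. Note the quirk this relies on: Java 8 and earlier report their version as "1.8.x", while Java 9 and later report the major version first ("11.0.2"). The helper below is a minimal sketch and not part of PySpark:

```python
def is_java8(version: str) -> bool:
    """Return True if a java version string (e.g. "1.8.0_292") is Java 8.

    Hypothetical helper: Java 8 and earlier report "1.<major>.x",
    while Java 9+ report the major version first ("11.0.2").
    """
    parts = version.split(".")
    if parts[0] == "1":
        return len(parts) > 1 and parts[1] == "8"
    return parts[0] == "8"

print(is_java8("1.8.0_292"))  # True  - Java 8, compatible with Spark 2.x
print(is_java8("11.0.2"))     # False - Java 11, triggers the module error
```

The InaccessibleObjectException in the question is exactly what this check guards against: older Spark versions use reflection on JDK internals that Java 9's module system no longer opens to unnamed modules.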

After this you should be able to run your code as expected:

textFile = spark.read.text("README.md")
textFile.show()

Upvotes: 4
