lordlabakdas

Reputation: 1193

Spark Shell "Failed to Initialize Compiler" Error on a mac

I just installed Spark on my new machine and get the following error after installing Java, Scala, and Apache Spark using Homebrew. The install steps are given below:

$ brew cask install java
$ brew install scala
$ brew install apache-spark

Once installed, when I try to run a basic example using spark-shell, I get the following error. Any help is greatly appreciated.

$ spark-shell
 Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
    at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
    at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
    at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
    at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
    at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895)
    at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918)
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337)
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336)
    at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
    at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336)
    at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908)
    at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002)
    at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997)
    at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:98)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:70)
    at org.apache.spark.repl.Main$.main(Main.scala:53)
    at org.apache.spark.repl.Main.main(Main.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Upvotes: 16

Views: 8617

Answers (6)

Imran K

Reputation: 81

Updating the alternatives is required not only for java but also for javac, javap, etc.:

$ sudo update-alternatives --config java
$ sudo update-alternatives --config javac

So it is better to remove the Java versions above 8 and then install Java 8.
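If you go the update-alternatives route, it is worth confirming afterwards that all the JDK tools agree; a minimal sketch (the exact set of registered tools depends on which JDK packages your distribution installed):

$ sudo update-alternatives --config javap
$ java -version     # should report 1.8.x after the switch
$ javac -version    # should match the java version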

Upvotes: 0

Haha TTpro

Reputation: 5556

Problem: Spark is incompatible with the current Java version.

Here is another solution that uses SDKMAN.

Install SDKMAN:

curl -s "https://get.sdkman.io" | bash

Then close the terminal and open a new one.

After that, install Java 8

sdk install java 8.0.181-zulu

Now, test if it works. Go to your spark/bin directory and run

./spark-shell

You should not see that error again.
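To keep Java 8 active in new shells as well, SDKMAN can pin it as the default; a small sketch, assuming the 8.0.181-zulu identifier from above is one of the versions shown by sdk list java:

# make Java 8 the default JDK for all new shells
sdk default java 8.0.181-zulu

# or switch only the current shell session
sdk use java 8.0.181-zulu

# confirm before launching spark-shell
java -version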

Upvotes: 0

Binod Suman

Reputation: 61

I faced the same problem, but when I checked my laptop's Java version, it was 9. I changed to Java 8 and everything worked fine.

Just try this solution. It should work if you are getting exactly the same error as at the start of this thread.
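On a Mac, one way to make that switch without uninstalling Java 9 is to point JAVA_HOME at the Java 8 install; a minimal sketch, assuming a JDK 8 is already installed and visible to /usr/libexec/java_home:

# point the current shell at JDK 8 (add to ~/.bash_profile to make it permanent)
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
java -version    # should now report 1.8.x
spark-shell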

Upvotes: 1

gxpr

Reputation: 856

As ZackK wrote, Spark is incompatible with Java 9, so you can check which versions of Java you have on your machine and choose a compatible one, assuming you have one.

$ sudo update-alternatives --config java

Which in my case returned:

There are 2 choices for the alternative java (providing /usr/bin/java).

    0   /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java   1081   auto mode
  * 1   /opt/java/jdk-9.0.4/bin/java                     1      manual mode

The asterisk in front of 1 denotes the active version. Choosing 0 changed it to a compatible version.

$ java -version

which returned: openjdk version "1.8.0_151"

After the change spark-shell worked.
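The same switch can also be done non-interactively with --set, which is useful in scripts; a sketch using the Java 8 path from the listing above (your path may differ):

$ sudo update-alternatives --set java /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
$ java -version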

Upvotes: 4

Kras

Reputation: 1

Win10: you have to switch to JDK 8. Set JAVA_HOME to the JDK 8 installation and remove C:\ProgramData\Oracle\Java\javapath from PATH (it always points at JDK 9).
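A rough sketch of the same change from an elevated Command Prompt, assuming JDK 8 is installed under C:\Program Files\Java (adjust the path to your actual install):

:: point JAVA_HOME at the JDK 8 install (example path, adjust to yours)
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"

:: after removing C:\ProgramData\Oracle\Java\javapath from PATH,
:: open a new prompt and verify:
java -version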

Upvotes: 0

ZackK

Reputation: 446

Spark is incompatible with Java 9, which is the version brew cask install java will install if it is up to date. If you did install Java 9, what you need to do is install Java 8 instead:

brew cask uninstall java
brew tap caskroom/versions
brew cask search java
brew cask install java8
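After reinstalling, it is worth confirming that spark-shell actually picks up Java 8; a small sketch, assuming the cask installed a JDK that /usr/libexec/java_home can see:

java -version    # should report 1.8.x, not 9

# if Java 9 is still on the path, point JAVA_HOME at 1.8 explicitly
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
spark-shell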

Upvotes: 31
