Reputation: 199
I'm getting an error when running the spark-shell command through cmd, and I've had no luck resolving it so far. I have Python, Java, Spark, Hadoop (winutils.exe), and Scala installed, with the versions as below:
I followed the steps below and ran spark-shell through cmd (from C:\Program Files\spark-3.2.0-bin-hadoop3.2\bin>):
1. Set the JAVA_HOME variable to C:\Program Files\Java\jdk1.8.0_311\bin and added %JAVA_HOME%\bin to Path.
2. Set the SPARK_HOME variable to C:\spark-3.2.0-bin-hadoop3.2\bin and added %SPARK_HOME%\bin to Path.
3. Placed winutils.exe in C:\Hadoop\bin (make sure winutils.exe is actually inside this path).
4. Set the HADOOP_HOME variable to C:\Hadoop and added %HADOOP_HOME%\bin to Path.
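As a sanity check on this setup, here is a minimal Scala sketch (assuming only the variable names and paths above) that I can paste into a plain scala REPL, or into spark-shell if it ever reaches a prompt, to confirm these variables are visible to the JVM:

```scala
// Sanity check: print the environment variables the Spark setup relies on.
Seq("JAVA_HOME", "SPARK_HOME", "HADOOP_HOME").foreach { name =>
  println(s"$name = ${sys.env.getOrElse(name, "<not set>")}")
}

// winutils.exe must sit under %HADOOP_HOME%\bin for Spark to work on Windows.
val winutils = new java.io.File(sys.env.getOrElse("HADOOP_HOME", ""), "bin\\winutils.exe")
println(s"winutils.exe found: ${winutils.exists}")
```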
Am I missing anything? I've posted my question with the error details in another thread: "spark-shell command throwing this error: SparkContext: Error initializing SparkContext".
Upvotes: 2
Views: 4461
Reputation: 2091
You went the difficult way by installing everything by hand. You may need Scala too; be extremely vigilant about the version you install, since from your example it looks like your Spark build targets Scala 2.12.
But you are right: Spark is extremely demanding in terms of version matching. Java 8 is fine, and Java 11 is OK too, but no more recent version is supported.
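If you can get any Scala REPL open on that machine (plain scala, or spark-shell once it starts), a quick sketch like this shows which versions are actually in play; note that the spark value only exists inside spark-shell:

```scala
// Print the Java and Scala versions this REPL is actually running on.
println(s"Java:  ${System.getProperty("java.version")}")  // expect 1.8.x or 11.x
println(s"Scala: ${util.Properties.versionString}")       // expect 2.12.x for spark-3.2.0-bin-hadoop3.2

// Inside spark-shell only: the pre-built session exposes the Spark version.
// println(s"Spark: ${spark.version}")
```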
Alternatively, you can:
Upvotes: 1