Reputation: 172
I followed all of the environment-variable and installation instructions for Spark. Now when I run pyspark, I get the following error:
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"
I have already added PATH, HADOOP_HOME, and SPARK_HOME, along with the winutils.exe file. I also tried one of the solutions posted on the web for this error, which says to change the permissions like this:
C:\winutils\bin>winutils.exe chmod 777 \tmp\hive
Nothing worked.
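In case it helps, here is a quick way to confirm the variables are actually visible to the Python process before launching pyspark (the printed values will of course differ per machine):

import os

# Print the environment variables Spark depends on; None means the
# variable is not visible to this Python process.
for var in ("HADOOP_HOME", "SPARK_HOME", "PATH"):
    print(var, "=", os.environ.get(var))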
Spark itself does start, but nothing else works beyond that.
What am I missing here?
Upvotes: 1
Views: 4504
Reputation: 302
(Assuming a Windows environment) Check and set the permissions on \tmp\hive as shown below.
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwx------ 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive
C:\spark\spark-2.2.0-bin-hadoop2.7\bin>%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
drwxrwxrwx 1 BUILTIN\Administrators CORP\Domain Users 0 Oct 13 2017 \tmp\hive
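Once the directory shows drwxrwxrwx, you can verify the fix from pyspark by forcing the Hive session state to instantiate. A minimal sketch (the app name is arbitrary):

from pyspark.sql import SparkSession

# Creating a Hive-enabled session exercises HiveSessionStateBuilder;
# with correct \tmp\hive permissions this should no longer raise
# IllegalArgumentException.
spark = (SparkSession.builder
         .appName("hive-permission-check")
         .enableHiveSupport()
         .getOrCreate())
spark.sql("SHOW DATABASES").show()
spark.stop()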
Upvotes: 1