Will

Reputation: 2227

Spark 1.5.1 spark-shell throws RuntimeException

I am simply trying to launch the Spark shell on my local Windows 8 machine, and here's the error message that I get:

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)

Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 56 more

The REPL itself starts, but I can't use the sqlContext.

Has anyone faced this problem before? Any answer will be helpful, thanks.

Upvotes: 3

Views: 4284

Answers (3)

Nishu Tayal

Reputation: 20840

First, you need to download the winutils.exe build compatible with your Spark version and operating system. Place it in a folder of your choice, inside a bin subdirectory, e.g. D:\winutils\bin\winutils.exe

Now, if /tmp/hive is present on your D: drive, run the following command:

D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
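Putting the two steps together, and assuming winutils.exe sits at D:\winutils\bin and the Hive scratch directory resolves to D:\tmp\hive (both paths here are just the examples above, adjust them to your setup), the full sequence from a Windows command prompt would look roughly like:

```shell
:: Point Hadoop at the folder that contains bin\winutils.exe
set HADOOP_HOME=D:\winutils
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Open up the Hive scratch directory that Spark's HiveContext checks on startup
winutils.exe chmod 777 D:\tmp\hive

:: Optionally list the directory to confirm the new permissions
winutils.exe ls D:\tmp\hive
```

After this, relaunching spark-shell from the same prompt should get past the SessionState permission check.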

For more details, refer to these posts:

Frequent Issues occurred during Spark Development

https://issues.apache.org/jira/browse/SPARK-10528

Upvotes: 2

sybergeek

Reputation: 31

RESOLVED: I downloaded the correct winutils version and the issue was resolved. Ideally it should be compiled locally, but if you download a prebuilt binary, make sure it is 32- or 64-bit as appropriate. I tried this on Windows 7 64-bit with Spark 1.6, downloaded winutils.exe from https://www.barik.net/archive/2015/01/19/172716/, and it worked. Complete steps are at: http://letstalkspark.blogspot.com/2016/02/getting-started-with-spark-on-window-64.html

Upvotes: 2

Nitesh Saxena

Reputation: 630

This might be helpful in this case: https://issues.apache.org/jira/browse/SPARK-10528

Upvotes: 0
