JRR

Reputation: 6152

Error when starting the spark shell

I just downloaded the latest version of Spark, and when I started the spark shell I got the following error:

java.net.BindException: Failed to bind to: /192.168.1.254:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)

...
...

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
...
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

Is there something that I missed in setting up Spark?

Upvotes: 5

Views: 10425

Answers (3)

Vijay Krishna

Reputation: 1067

I was experiencing the same issue. First, go to .bashrc and add

export SPARK_LOCAL_IP=172.30.43.105

then go to

cd $HADOOP_HOME/bin

then run the following command

hdfs dfsadmin -safemode leave

This just turns the namenode's safe mode off.
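
To double-check that safe mode was really the culprit, you can query the namenode state before and after (a standard HDFS command, used here only as a sanity check):

hdfs dfsadmin -safemode get

It should report "Safe mode is OFF" once the leave command has taken effect.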

Then delete the metastore_db folder from the Spark home folder or its bin directory. It will generally be in whatever directory you start a Spark session from.
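
For example, assuming the metastore was created in the directory you usually launch the shell from (the paths below are an assumption; check where metastore_db actually exists on your machine):

rm -rf ./metastore_db $SPARK_HOME/metastore_db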

Then I ran my spark-shell like this:

spark-shell --master "spark://localhost:7077"

and voilà, I did not get the sqlContext.implicits._ error.

Upvotes: 1

rake

Reputation: 2408

Try setting the Spark env variable SPARK_LOCAL_IP to a local IP address.

In my case, I was running Spark on an Amazon EC2 Linux instance. spark-shell stopped working with an error message similar to yours. I was able to fix it by adding a setting like the following to the Spark config file conf/spark-env.sh.

export SPARK_LOCAL_IP=172.30.43.105

You could also set it in ~/.profile or ~/.bashrc.

Also check the host settings in /etc/hosts.
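
For example, an entry mapping the machine's hostname to a reachable local address (the hostname and IP below are placeholders; substitute your own):

172.30.43.105   ip-172-30-43-105

If the hostname resolves to an address Spark cannot bind to, the driver fails with the same BindException.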

Upvotes: 4

dpeacock

Reputation: 2757

See SPARK-8162.

It looks like it only affects the 1.4.1 and 1.5.0 snapshot builds - you're probably best off running the latest stable release (1.4.0 at the time of writing).
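
If you are not sure which build you downloaded, you can print the version banner (the output format varies slightly between releases):

spark-submit --version

Inside a working shell, sc.version returns the same version string.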

Upvotes: 1
