Anthony

Reputation: 35938

error: not found: value sqlContext on EMR

I am on EMR using Spark 2. When I ssh into the master node and run spark-shell, I don't seem to have access to sqlContext. Is there something I'm missing?

[hadoop@ip-172-31-13-180 ~]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/10 21:07:05 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
16/11/10 21:07:14 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.31.13.180:4040
Spark context available as 'sc' (master = yarn, app id = application_1478720853870_0003).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.SQLContext

scala> sqlContext
<console>:25: error: not found: value sqlContext
       sqlContext
       ^

Since I'm getting the same error on my local computer, I've tried the following, to no avail:

exported SPARK_LOCAL_IP:

➜  play grep "SPARK_LOCAL_IP" ~/.zshrc
export SPARK_LOCAL_IP=127.0.0.1
➜  play source ~/.zshrc
➜  play spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/11/10 16:12:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/10 16:12:19 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://127.0.0.1:4040
Spark context available as 'sc' (master = local[*], app id = local-1478812339020).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sqlContext
<console>:24: error: not found: value sqlContext
       sqlContext
       ^

scala>

My /etc/hosts contains the following

127.0.0.1       localhost
255.255.255.255 broadcasthost
::1             localhost

Upvotes: 1

Views: 2183

Answers (1)

user6022341


Spark 2.0 doesn't pre-define sqlContext in spark-shell anymore:

  • use SparkSession instead (initialized in spark-shell as spark).
  • for legacy applications you can derive one from the session, as in the sketch below:

    val sqlContext = spark.sqlContext
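
For example, here is a minimal sanity check you can paste into the same spark-shell session; the nums view name and the range(5) DataFrame are just throwaway illustrations:

    // Spark 2.x: run SQL straight through the SparkSession
    val df = spark.range(5)              // trivial DataFrame, just for illustration
    df.createOrReplaceTempView("nums")   // "nums" is an arbitrary view name
    spark.sql("SELECT * FROM nums").show()

    // Legacy path: the derived SQLContext behaves the same
    val sqlContext = spark.sqlContext
    sqlContext.sql("SELECT COUNT(*) FROM nums").show()

Both calls go through the same underlying session, so the derived SQLContext is only a compatibility shim for old code; new code should use spark directly.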
    

Upvotes: 3
