user1888243

Reputation: 2691

How to enable Hive support for the spark object in spark-shell (Spark 2.1.1)

I am trying to enable Hive support for the spark object in spark-shell, but it doesn't work. I'm using Hortonworks HDP. The following is what I get when I try to enable Hive support:

scala> val spark3 = SparkSession.builder.enableHiveSupport.getOrCreate
17/10/24 21:30:28 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
spark3: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@22f8be7c

scala> spark3.catalog
res3: org.apache.spark.sql.catalog.Catalog = org.apache.spark.sql.internal.CatalogImpl@49c13ecd
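For reference, one way to check whether enableHiveSupport actually took effect is to read the session's catalog implementation (a standard Spark SQL setting: it is "hive" when Hive support is enabled and "in-memory" when it is not):

scala> spark3.conf.get("spark.sql.catalogImplementation")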

Upvotes: 2

Views: 8164

Answers (1)

OneCricketeer

Reputation: 192013

In HDP, spark-shell already creates a valid SparkSession with Hive support.

The warning you got says that getOrCreate returned the existing session.
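Since getOrCreate returned the existing session, spark3 and the shell's built-in spark should be the very same object, which you can confirm with a reference-equality check:

spark3 eq spark  // true when both names point to the same SparkSession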

You can verify that the session talks to the Hive metastore by running:

spark.sql("show tables").show()

Also, whether you inspect spark.catalog or spark3.catalog, all you get is that object's toString info, which doesn't tell you anything about Hive support.
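If you want to see what the catalog actually contains rather than its toString, the Catalog API can list databases and tables; with Hive support enabled these come from the Hive metastore:

spark.catalog.listDatabases().show()
spark.catalog.listTables().show()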

Upvotes: 3
