Krishna Reddy

Reputation: 1099

How to enable or disable Hive support in spark-shell through Spark property (Spark 1.6)?

Is there a configuration property that can be set to explicitly enable or disable Hive support through spark-shell in Spark 1.6? I tried to list all the sqlContext configuration properties with,

sqlContext.getAllConfs.foreach(println)

But I am not sure which property, if any, actually controls Hive support. Is there any other way to do this?

Upvotes: 7

Views: 35047

Answers (2)

Jacek Laskowski

Reputation: 74779

Spark >= 2.0

Hive support can be enabled or disabled with the configuration property spark.sql.catalogImplementation.

The possible values for spark.sql.catalogImplementation are in-memory and hive.

SPARK-16013 Add option to disable HiveContext in spark-shell/pyspark
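
For example, you can pick the catalog implementation when launching the shell. A minimal sketch, assuming Spark 2.x (the property and the --conf flag are standard; the output comment is illustrative):

# Launch spark-shell without Hive support (in-memory catalog)
spark-shell --conf spark.sql.catalogImplementation=in-memory

// Inside the shell: confirm which catalog implementation is active
spark.sparkContext.getConf.get("spark.sql.catalogImplementation")
// res0: String = in-memory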


Spark < 2.0

Such a Spark property is not available in Spark 1.6.

One way to work around it is to remove the Hive-related jars from the classpath, which in turn disables Hive support in Spark (Spark enables Hive support only when the required Hive classes are available).
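
To check which mode a Spark 1.6 shell ended up in, you can inspect the runtime type of the pre-built sqlContext. A minimal check: spark-shell builds a HiveContext when the Hive classes are on the classpath, and a plain SQLContext otherwise:

// Spark 1.6 shell: which context did we get?
sqlContext.getClass.getName
// "org.apache.spark.sql.hive.HiveContext" -> Hive support is on
// "org.apache.spark.sql.SQLContext"       -> Hive support is off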

Upvotes: 17

Yehor Krivokon

Reputation: 877

You can enable Hive support simply by creating a SparkSession, but only in Spark >= 2.0:

import org.apache.spark.sql.SparkSession

// Path used for the Hive-managed warehouse; adjust as needed
val warehouseLocation = "spark-warehouse"
val spark = SparkSession
  .builder()
  .appName("Spark Hive Example")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()
  .getOrCreate()
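
Once the session is created, a quick sanity check (a minimal sketch, assuming Spark 2.x) is to read back the catalog implementation, which should be hive when Hive support is enabled:

// Should return "hive" when Hive support is enabled
spark.conf.get("spark.sql.catalogImplementation")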

And here you can read how to configure Hive on Spark by changing Hive and Spark properties in hive-site.xml and spark-defaults.conf: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started. This should also work with Spark 1.6.1.

Upvotes: 5
