Reputation: 21
By default I get a pre-defined Spark session object (spark), which is not Hive-enabled. How can I get a Hive-enabled Spark session?
Upvotes: 1
Views: 1052
Reputation: 436
I know I'm late to answer this question, but I hope it will be useful for someone who's still working on it.
If the spark-defaults file doesn't set the spark.sql.catalogImplementation property, Toree SQL falls back to the local Derby metastore directory. You need to set this property to hive explicitly in the spark-defaults.conf file on the cluster, like this:

spark.sql.catalogImplementation hive
Restart the kernel after saving the changes to this file.
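As a rough sketch (assuming the Toree Scala kernel, where a spark session object is already pre-defined), you can check from a notebook cell which catalog implementation is active, and alternatively try to build a Hive-enabled session programmatically. Note that getOrCreate() returns the already-running session if one exists, so the programmatic approach only takes full effect in a fresh application; the appName used here is arbitrary.

import org.apache.spark.sql.SparkSession

// Check which catalog implementation the pre-defined Toree session is using;
// after the spark-defaults.conf change above this should print "hive".
println(spark.conf.get("spark.sql.catalogImplementation"))

// Alternative sketch: request Hive support when building the session instead of
// relying on spark-defaults.conf. If a session already exists (as in Toree),
// getOrCreate() simply returns it, so this mainly applies to standalone apps.
val hiveSpark = SparkSession.builder()
  .appName("hive-enabled-session")
  .enableHiveSupport()
  .getOrCreate()

// Quick sanity check against the Hive metastore.
hiveSpark.sql("SHOW DATABASES").show()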
Upvotes: 1