Reputation: 33
Whenever I run `spark.sql`, it always shows: `WARN Hive: Failed to access metastore. This class should not accessed in runtime.`
Upvotes: 2
Views: 1629
Reputation: 837
You can disable the Hive metastore in spark-shell by setting `spark.sql.catalogImplementation` (which is `hive` by default) to `in-memory`:
spark-shell --conf spark.sql.catalogImplementation=in-memory
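If you want this to apply to every Spark session rather than a single shell invocation, the same property can be set in Spark's defaults file. A minimal sketch, assuming the standard `$SPARK_HOME/conf/spark-defaults.conf` location:

```
# $SPARK_HOME/conf/spark-defaults.conf
# Use Spark's built-in in-memory catalog instead of the Hive metastore
spark.sql.catalogImplementation  in-memory
```

Note that with the in-memory catalog you lose access to any tables registered in the Hive metastore; this is only appropriate if you don't need Hive tables at all.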
Upvotes: 2
Reputation: 11075
If you're using Hive as the metastore, remember to start the metastore service first:
hive --service metastore
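Spark also needs to know where that metastore service is listening. One way is a `hive-site.xml` on Spark's classpath; a minimal sketch, assuming the metastore runs locally on its default port 9083 (adjust host and port to your setup):

```xml
<!-- $SPARK_HOME/conf/hive-site.xml -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <!-- Thrift URI of the running metastore service; 9083 is the default port -->
    <value>thrift://localhost:9083</value>
  </property>
</configuration>
```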
Upvotes: 0