Reputation: 2618
Both Spark and Hive are working fine individually, but when I try to write the output of a Spark DataFrame to a Hive table, I get the below error:
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Hive Schema version 1.2.0 does not match metastore's schema version 2.1.0 Metastore is not upgraded or corrupt
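For reference, the write itself is nothing unusual; it is essentially of the form below (the app name, database, and table names here are just placeholders):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkHiveWrite")
  .enableHiveSupport()   // connect to the existing Hive metastore
  .getOrCreate()

// any small DataFrame reproduces the error on saveAsTable
val df = spark.range(10).toDF("id")
df.write.mode("overwrite").saveAsTable("test_db.spark_output")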
The output of "schematool -dbType postgres -info" is attached as a screenshot: Schematool Results Screenshot
Additional note: From this Databricks Spark documentation page, I found that Apache Spark supports Hive versions only from 0.12 up to 1.2.1.
So is downgrading my Hive version the only way for me to connect? Or is there any other provision, such as adding extra jars, that would enable writing Spark 2.1.0 DataFrames into Hive 2.1.1 tables?
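The only mechanism I am aware of is the spark.sql.hive.metastore.version / spark.sql.hive.metastore.jars pair, roughly as sketched below, but the documentation above lists 1.2.1 as the highest metastore version supported by Spark 2.1.0, so I am not sure whether the Hive 2.1.1 client jars can be plugged in this way (the jar path is a placeholder):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkHiveVersionTest")
  // highest metastore version documented for Spark 2.1.0
  .config("spark.sql.hive.metastore.version", "1.2.1")
  // "builtin", "maven", or a classpath pointing at the Hive client jars
  .config("spark.sql.hive.metastore.jars", "/opt/hive/lib/*")
  .enableHiveSupport()
  .getOrCreate()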
Appreciate your opinions on this. Thanks in advance.
Upvotes: 1
Views: 1998
Reputation: 2618
I finally found the answer myself. It will be helpful to those who are stuck with the same issue.
It is resolved by adding the following property to the hive-site.xml file:
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
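As far as I understand, setting hive.metastore.schema.verification to false simply downgrades the version check to a warning: the Hive 1.2 client that ships with Spark no longer fails when it sees the 2.1.0 schema recorded in the metastore database, so the DataFrame write goes through.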
Upvotes: 3