user1217694

Reputation: 11

HCatalog Hive issue

I am trying to run the HCatalog example from the following link:

http://www.cloudera.com/content/cloudera/en/documentation/cdh4/v4-2-0/CDH4-Installation-Guide/cdh4ig_topic_19_6.html

I am getting the following exception when I run the job.

Exception in thread "main" com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2232)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
    at org.apache.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:167)
    at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:143)
    at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:544)
    at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:103)
    at org.apache.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:85)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:85)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:54)
    at org.apache.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:46)
    at com.otsi.hcat.UseHCat.run(UseHCat.java:69)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.otsi.hcat.UseHCat.main(UseHCat.java:96)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.NoClassDefFoundError: org/antlr/runtime/RecognitionException
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1378)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:498)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:476)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:524)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:398)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:357)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4948)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:154)
    at org.apache.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:246)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:170)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:167)
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
    ... 19 more
Caused by: java.lang.ClassNotFoundException: org.antlr.runtime.RecognitionException

Before running the MR job, I executed the following commands:

$ export HCAT_HOME=$HIVE_HOME/hcatalog
$ HCATJAR=$HCAT_HOME/share/hcatalog/hcatalog-core-0.11.0.jar
$ HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.jar
$ export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-0.13.0.jar:$HIVE_HOME/lib/hive-metastore-0.13.0.jar:$HIVE_HOME/lib/jdo-api-3.0.1.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/lib/slf4j-api-1.6.4.jar:$HIVE_HOME/conf:/usr/hadoop/hadoop-2.4.0/etc/hadoop/
$ LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'`
$ export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar
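
For reference, the job itself is then launched following the pattern in the linked example. This is a sketch only: the jar path and the table arguments below are placeholders, not values from my actual run.

# Sketch: jar name and table names are assumed; -libjars ships the
# listed jars to the cluster alongside the job.
$ hadoop jar UseHCat.jar com.otsi.hcat.UseHCat -libjars $LIBJARS <input_table> <output_table>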

Upvotes: 1

Views: 4861

Answers (3)

Hamdi Charef

Reputation: 649

You need to configure your environment variables in ~/.bashrc:

export SQOOP_HOME=/usr/lib/sqoop
export HBASE_HOME=/usr/local/Hbase
export HIVE_HOME=/usr/local/hive
export HCAT_HOME=/usr/local/hive/hcatalog
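
After adding these lines, reload the file so the current shell picks the variables up; a quick sanity check might look like this:

$ source ~/.bashrc
$ echo $HCAT_HOME    # should print /usr/local/hive/hcatalog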

Upvotes: 0

Abhiram

Reputation: 362

Make sure you have the following three DataNucleus jars available on the classpath:

datanucleus-rdbms-3.x.x.jar
datanucleus-core-3.x.x.jar
datanucleus-api-jdo-3.x.x.jar

It is also always good to have $HIVE_HOME/conf on HADOOP_CLASSPATH and CLASSPATH, because it contains hive-site.xml and therefore the information needed to connect to the metastore.
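
A minimal sketch of one way to add all of these, assuming the DataNucleus jars sit in $HIVE_HOME/lib (the glob avoids hard-coding version numbers, which differ between distributions):

# Append every DataNucleus jar under $HIVE_HOME/lib to the classpath,
# then add the conf dir so clients can read the metastore settings.
for j in $HIVE_HOME/lib/datanucleus-*.jar; do
  export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$j
done
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/conf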

Upvotes: 0

nochum

Reputation: 795

I'm not running a CDH distribution; however, I was able to get this to work with the following configuration settings:

export HCAT_HOME=/usr/lib/hive-hcatalog
export HIVE_HOME=/usr/lib/hive
HCATJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-core-0.13.0.2.1.1.0-385.jar
HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.13.0.2.1.1.0-385.jar
HIVE_VERSION=0.13.0.2.1.1.0-385
export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-$HIVE_VERSION.jar:$HIVE_HOME/lib/hive-metastore-$HIVE_VERSION.jar:$HIVE_HOME/lib/libfb303-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.9.0.jar:$HIVE_HOME/conf:/etc/hadoop/conf
LIBJARS=`echo $HADOOP_CLASSPATH | sed -e 's/:/,/g'`
export LIBJARS=$LIBJARS,$HIVE_HOME/lib/antlr-runtime-3.4.jar

A few things to note:

  1. The comma on the last line between "$LIBJARS" and "$HIVE_HOME" is correct.
  2. I removed the references to $HIVE_HOME/lib/jdo2-api-2.3-ec.jar and $HIVE_HOME/lib/slf4j-api-1.6.4.jar since I didn't have them in my Hadoop distro. The code worked fine without them.
  3. Hadoop moves pretty quickly, so jar versions change. For each jar file referenced in these settings, run ls -l to confirm that the jar actually exists where you believe it should be (a quick check loop follows this list).
  4. This code uses some deprecated API calls. My suggestion (at least for now) is not to change the code; I have found that switching to the non-deprecated versions breaks it (see also Radek's update to the same effect).
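
For point 3, a small check along these lines (splitting HADOOP_CLASSPATH on its colons) will report any entry that doesn't actually exist on disk:

# Print every classpath entry missing from the local filesystem.
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | while read -r p; do
  [ -n "$p" ] && [ ! -e "$p" ] && echo "MISSING: $p"
done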

I hope this helps!

Upvotes: 1
