Reputation: 129
I am trying to run HiveTopology from the storm-hive project on my local machine, connecting to a remote Hive instance running on a Linux server. I have followed the instructions for creating the table in ORC format and have configured all the settings as described here.
I am getting the error below:
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache.get(LocalCache.java:3937) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739) ~[guava-18.0.jar:na]
at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:448) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:274) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:243) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:180) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:157) ~[hive-hcatalog-streaming-1.2.1.jar:1.2.1]
at org.apache.storm.hive.common.HiveWriter$5.call(HiveWriter.java:229) ~[storm-hive-0.10.0.jar:0.10.0]
at org.apache.storm.hive.common.HiveWriter$5.call(HiveWriter.java:226) ~[storm-hive-0.10.0.jar:0.10.0]
at org.apache.storm.hive.common.HiveWriter$9.call(HiveWriter.java:332) ~[storm-hive-0.10.0.jar:0.10.0]
at java.util.concurrent.FutureTask.run(Unknown Source) ~[na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[na:1.8.0_45]
at java.lang.Thread.run(Unknown Source) [na:1.8.0_45]
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) ~[guava-18.0.jar:na]
... 17 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_45]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[na:1.8.0_45]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[na:1.8.0_45]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[na:1.8.0_45]
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) ~[hive-metastore-1.2.1.jar:1.2.1]
... 27 common frames omitted
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.NullPointerException
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache.get(LocalCache.java:3937) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) ~[guava-18.0.jar:na]
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) ~[guava-18.0.jar:na]
at org.apache.hadoop.security.Groups.getGroups(Groups.java:182) ~[hadoop-common-2.7.1.jar:na]
at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1518) ~[hadoop-common-2.7.1.jar:na]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:436) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330) ~[hive-hcatalog-core-1.2.1.jar:1.2.1]
... 32 common frames omitted
Please point me in the right direction to resolve the issue. I am using Hadoop 2.7.1 and Hive 1.2.1 with transaction attributes enabled.
Upvotes: 1
Views: 1403
Reputation: 349
I had the same problem running the storm-hive bolt locally. The root cause looks like this:
Caused by: java.lang.NullPointerException: null
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012) ~[na:1.8.0_20]
at org.apache.hadoop.util.Shell.runCommand(Shell.java:483) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.util.Shell.run(Shell.java:456) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.util.Shell.execCommand(Shell.java:815) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.util.Shell.execCommand(Shell.java:798) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:84) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:239) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:220) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:208) ~[hadoop-common-2.7.1.2.3.0.0-2557.jar:na]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) ~[guava-14.0.1.jar:na]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) ~[guava-14.0.1.jar:na]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) ~[guava-14.0.1.jar:na]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) ~[guava-14.0.1.jar:na]
The problem is in the org.apache.hadoop.util.Shell class. The getGroupsForUserCommand method returns an invalid command array when you run on Windows and it can't find winutils.exe:
public static String[] getGroupsForUserCommand(final String user) {
    //'groups username' command return is non-consistent across different unixes
    return (WINDOWS) ? new String[] { WINUTILS, "groups", "-F", "\"" + user + "\"" }
                     : new String[] { "bash", "-c", "id -gn " + user + "&& id -Gn " + user };
}
On Windows without winutils.exe, this code returns a string array with four elements whose first element (WINUTILS) is null.
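The failure can be reproduced in isolation. The following standalone sketch does not use the Hadoop classes; the helper simply mirrors the shape of getGroupsForUserCommand. It shows that a command array whose first element is null makes ProcessBuilder.start() throw the NullPointerException seen at the bottom of the trace:

```java
// Standalone sketch: when winutils.exe cannot be located, the WINUTILS
// constant is null, so the Windows branch of getGroupsForUserCommand
// produces a command array with a null first element. ProcessBuilder
// rejects null command elements with a NullPointerException on start().
public class WinutilsNpeDemo {
    // Mirrors Shell.getGroupsForUserCommand; "winutils" stands in for the
    // WINUTILS constant and "windows" for the WINDOWS constant.
    static String[] getGroupsForUserCommand(String winutils, String user, boolean windows) {
        return windows
                ? new String[] { winutils, "groups", "-F", "\"" + user + "\"" }
                : new String[] { "bash", "-c", "id -gn " + user + "&& id -Gn " + user };
    }

    public static void main(String[] args) {
        // Simulate Windows with winutils.exe missing (WINUTILS == null)
        String[] cmd = getGroupsForUserCommand(null, "storm", true);
        try {
            new ProcessBuilder(cmd).start();
        } catch (NullPointerException e) {
            System.out.println("NullPointerException from ProcessBuilder.start()");
        } catch (java.io.IOException e) {
            System.out.println("IOException: " + e.getMessage());
        }
    }
}
```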
To solve this problem, set the HADOOP_HOME environment variable and make sure winutils.exe is present in %HADOOP_HOME%\bin (download it here).
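It can also help to fail fast with a clear message before the topology ever reaches the Hadoop security code. Below is a hypothetical pre-flight helper (the class and method names are my own, not part of storm-hive or Hadoop) that checks the same location Shell looks in:

```java
import java.io.File;

// Hypothetical pre-flight check (not part of storm-hive): call this before
// submitting the topology so a missing winutils.exe produces a clear error
// instead of a NullPointerException deep inside org.apache.hadoop.util.Shell.
public class WinutilsCheck {
    // Returns the absolute path of winutils.exe under <hadoopHome>/bin,
    // or throws with an actionable message if it cannot be found.
    static String locateWinutils(String hadoopHome) {
        if (hadoopHome == null || hadoopHome.isEmpty()) {
            throw new IllegalStateException("HADOOP_HOME environment variable is not set");
        }
        File winutils = new File(new File(hadoopHome, "bin"), "winutils.exe");
        if (!winutils.isFile()) {
            throw new IllegalStateException("winutils.exe not found at " + winutils);
        }
        return winutils.getAbsolutePath();
    }

    public static void main(String[] args) {
        System.out.println(locateWinutils(System.getenv("HADOOP_HOME")));
    }
}
```

Calling locateWinutils(System.getenv("HADOOP_HOME")) at topology startup turns the obscure NPE into an explicit configuration error.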
Upvotes: 1