Reputation: 1068
I am a novice in Hadoop and am trying to fetch data from HDFS using the HDFS API in Java. I am getting this error when running the program. Here is the stack trace.
Exception in thread "AWT-EventQueue-0" java.lang.NoSuchMethodError: org.apache.hadoop.tracing.SpanReceiverHost.get(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)Lorg/apache/hadoop/tracing/SpanReceiverHost;
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:634)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
at hdfstest1.HDFSTestGUI1.listDirectory(HDFSTestGUI1.java:663)
at hdfstest1.HDFSTestGUI1.homeBtnActionPerformed(HDFSTestGUI1.java:483)
at hdfstest1.HDFSTestGUI1.access$1200(HDFSTestGUI1.java:47)
at hdfstest1.HDFSTestGUI1$13.actionPerformed(HDFSTestGUI1.java:246)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.AbstractButton.doClick(AbstractButton.java:376)
at javax.swing.AbstractButton.doClick(AbstractButton.java:356)
at hdfstest1.HDFSTestGUI1.<init>(HDFSTestGUI1.java:65)
at hdfstest1.HDFSTestGUI1$18.run(HDFSTestGUI1.java:571)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:756)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:726)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
After reading a couple of articles, I came to know that my current version may not support some of the methods I am using. My question is: how do I check which method is not supported by my current version of Hadoop, and how do I migrate to the correct version to get this working, in the best way, without ruining my current configuration?
I am using Hadoop 1.2.1. I am happy to share my code if needed. It would be appreciated if someone could help me. :)
Upvotes: 1
Views: 377
Reputation: 1068
1) As suggested by Matei Florescu in the answer below, I used try/catch to find which method is not supported by my current version. Thanks Matei.
2) For the second problem, I changed the jar versions in my project to match my installed Hadoop (without changing the installation itself). I followed these steps:
hadoop version
in the terminal, or you can check manually by browsing your Hadoop root directory (in my case /usr/local/hadoop), where the jar file names include the version. I removed all Hadoop entries from pom.xml and added only one, for hadoop-core. Here is my final pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany</groupId>
    <artifactId>HDFSTest1</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.jdesktop</groupId>
            <artifactId>beansbinding</artifactId>
            <version>1.2.1</version>
        </dependency>
    </dependencies>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>
</project>
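Since the root cause is a mismatch between the jars on the runtime classpath and the installed Hadoop, it can also help to print which Hadoop jars the running JVM actually sees. This is a pure-JDK sketch (the class and method names here are my own, not part of any Hadoop API):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class ClasspathCheck {
    // Return the names of jar files on the given classpath string
    // whose file name mentions "hadoop".
    static List<String> hadoopJars(String classpath) {
        List<String> result = new ArrayList<>();
        for (String entry : classpath.split(File.pathSeparator)) {
            String name = new File(entry).getName();
            if (name.contains("hadoop") && name.endsWith(".jar")) {
                result.add(name);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Inspect the classpath of the running JVM; the version embedded in
        // each jar name should match what `hadoop version` reports.
        for (String jar : hadoopJars(System.getProperty("java.class.path"))) {
            System.out.println(jar);
        }
    }
}
```

If the printed versions disagree with your installation, that mismatch is exactly what produces `NoSuchMethodError` at runtime.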
Upvotes: 0
Reputation: 1195
1) "how to check which method is not supported by current version of hadoop " Programatically, you can catch the java.lang.NoSuchMethodError. So if you suspect that a method may not be implemented in the in the sofware you are using, surround the call in a try/catch and in catch do what you intend to do when this happens.
2) "how to migrate to correct version to get this working with best way without ruining my current configurations?" Just check the hadoop API documentation for different versions, one after the other, and use the version you need. I have been there, Hadoop changes a lot (maybe not so much as Spark, but still), and this is the only solution.
One piece of advice: once you find the version you need, stick to it. Future versions may come with additional features, but unless you really need something from a newer version, don't upgrade. (Lesson learned the hard way.)
Upvotes: 0