Reputation: 3276
I'm trying to deploy a standalone version of Hadoop 2.5.0, but the DataNode fails to start. The log prints:
2014-10-20 13:42:13,288 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
    at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:165)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:586)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:773)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:292)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1895)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1782)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1829)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2005)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2029)
I googled for a while and couldn't find any useful help. Since the error seemed to be related to the native library, I then tried to compile hadoop-2.5.0 myself on my machine (x86-64, CentOS 6.5), but I got the same error. I also tried the CDH version, still no good.
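A quick way to see which native libraries Hadoop actually loads (assuming the stock hadoop script from the 2.x distribution is on the PATH) is:

hadoop checknative -a

It prints whether libhadoop, zlib, snappy, lz4 and bzip2 are available and the file each one was loaded from.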
My hdfs-site.xml:
<property>
  <name>fs.checkpoint.dir</name>
  <value>/home/seg3/namesecondary</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/home/seg2/datanodedir</value>
</property>
<property>
  <name>dfs.datanode.hdfs-blocks-metadata.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
<property>
  <name>dfs.block.local-path-access.user</name>
  <value>root</value>
</property>
<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value>
</property>
<property>
  <name>dfs.domain.socket.path</name>
  <value>/var/run/hadoop-hdfs/dn._PORT</value>
</property>
<property>
  <name>dfs.client.file-block-storage-locations.timeout</name>
  <value>10000</value>
</property>
And core-site.xml:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:8020</value>
</property>
<property>
  <name>fs.trash.interval</name>
  <value>10080</value>
</property>
<property>
  <name>fs.trash.checkpoint.interval</name>
  <value>10080</value>
</property>
<property>
  <name>io.native.lib.available</name>
  <value>false</value>
</property>
Any ideas? By the way, Hadoop 2.3.0 works perfectly on my machine.
Upvotes: 2
Views: 2137
Reputation: 2430
To complete what Amos said, you also need to define:
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
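That export usually goes into etc/hadoop/hadoop-env.sh. A minimal sketch of the relevant lines, assuming HADOOP_HOME points at your 2.5.0 install directory (the /usr/local/hadoop path below is only an example):

export HADOOP_HOME=/usr/local/hadoop                                   # adjust to your install path
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
# also make sure the JVM searches the same directory for native libraries
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=${HADOOP_HOME}/lib/native"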
Upvotes: 0
Reputation: 3276
After trying to deploy the same package to a bunch of servers, I found the problem. Somehow Hadoop 2.3.0's native library had found its way into the JDK's native library path, which in turn polluted the Java runtime. When the DataNode tried to load the native library, it found the old one. After deleting those .so files, I got the DataNode up and running. Cheers.
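If anyone else hits this, a rough way to hunt for the stale copies (a sketch, not the exact commands I ran) is to list every libhadoop on the box and compare it against the directories the JVM searches by default:

# find every copy of the Hadoop native library on the machine
sudo find / -name 'libhadoop.so*' 2>/dev/null
# show the JVM's default native library search path (Java 7+)
java -XshowSettings:properties -version 2>&1 | grep 'java.library.path'
# any libhadoop.so in one of those directories but not under $HADOOP_HOME/lib/native
# is a candidate stale copy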
Upvotes: 1