Reputation: 41
I downloaded the Hadoop source code to import it locally into my Eclipse workspace. I added core-site.xml and hdfs-site.xml to the hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java directory.
In Eclipse I ran NameNode.java:
java NameNode.java -format
I get the following error message:
java.lang.IllegalArgumentException: URI has an authority component
at java.io.File.<init>(File.java:423)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.getStorageDirectory(NNStorage.java:329)
at org.apache.hadoop.hdfs.server.namenode.FSEditLog.initJournals(FSEditLog.java:276)
at org.apache.hadoop.hdfs.server.namenode.FSEditLog.initJournalsForWrite(FSEditLog.java:247)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:984)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1428)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1553)
2016-06-17 11:12:54,404 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-06-17 11:12:54,405 INFO namenode.NameNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/
My core-site.xml:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000/</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:///Users/Joker/tmp</value>
  </property>
</configuration>
My hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file://code/java/hadoop2.7.0/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file://code/java/hadoop2.7.0/dfs/data</value>
  </property>
</configuration>
I did not set HADOOP_HOME; I just want to run NameNode from the source code.
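Roughly speaking, running it from Eclipse amounts to a tiny driver like the sketch below, with the two XML files on the classpath (FormatDriver is just an illustrative name, not something in the Hadoop tree):

import org.apache.hadoop.hdfs.server.namenode.NameNode;

public class FormatDriver {
    public static void main(String[] args) throws Exception {
        // Same effect as passing "-format" as a program argument in the
        // Eclipse run configuration; core-site.xml and hdfs-site.xml must
        // be on the classpath.
        NameNode.main(new String[] { "-format" });
    }
}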
Upvotes: 1
Views: 2976
Reputation: 3074
You have to set the paths as given below in core-site.xml and hdfs-site.xml. The "URI has an authority component" error comes from the file://code/... values: with only two slashes, code is parsed as the URI's authority (host) component, which java.io.File rejects. Use plain local paths instead (or file:/// with three slashes). core-site.xml should look like this:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000/</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/Users/Joker/tmp</value>
  </property>
</configuration>
And hdfs-site.xml like this:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/code/java/hadoop2.7.0/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/code/java/hadoop2.7.0/dfs/data</value>
  </property>
</configuration>
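If you want to see the cause in isolation, here is a minimal standalone sketch (FileUriDemo is just an illustrative name) of why the two-slash form is rejected. It exercises the same java.io.File(URI) constructor that fails inside NNStorage.getStorageDirectory in your stack trace:

import java.io.File;
import java.net.URI;

public class FileUriDemo {
    public static void main(String[] args) {
        // Two slashes: "code" is parsed as the URI's authority (host) component,
        // so java.io.File refuses to build a path from it.
        try {
            new File(URI.create("file://code/java/hadoop2.7.0/dfs/name"));
        } catch (IllegalArgumentException e) {
            // Prints: URI has an authority component
            System.out.println("file://  -> " + e.getMessage());
        }

        // Three slashes: the authority is empty, everything after it is the path.
        File ok = new File(URI.create("file:///code/java/hadoop2.7.0/dfs/name"));
        System.out.println("file:/// -> " + ok.getPath());
    }
}

So either drop the file:// scheme entirely, as in the configs above, or write the directories as file:///code/java/hadoop2.7.0/dfs/name and file:///code/java/hadoop2.7.0/dfs/data.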
Upvotes: 3