Reputation: 4706
I'm studying Hadoop and currently I'm trying to set up a Hadoop 2.2.0 single node. I downloaded the latest distribution and uncompressed it; now I'm trying to set up the Hadoop Distributed File System (HDFS).
Now, I'm trying to follow the Hadoop instructions available here but I'm quite lost.
In the left bar you see there are references to the following files:
But where are those files?
I found /etc/hadoop/hdfs-site.xml, but it is empty!
I found /share/doc/hadoop/hadoop-project-dist/hadoop-common/core-default.xml, but it is just a piece of documentation!
So, which files do I have to modify to configure HDFS? And where are the default values read from?
Thanks in advance for your help.
Upvotes: 20
Views: 64458
Reputation: 24910
For Hadoop 3.2, the default config can be found at:
Local installation
$HADOOP_HOME/share/doc/hadoop/
  hadoop-project-dist/
    hadoop-common/
      core-default.xml
    hadoop-hdfs/
      hdfs-default.xml
  hadoop-mapreduce-client/
    hadoop-mapreduce-client-core/
      mapred-default.xml
  hadoop-yarn/
    hadoop-yarn-common/
      yarn-default.xml
Online: at http://hadoop.apache.org/docs/stable/, under the Configuration section at the bottom left.
Effective config: in the web console of a local instance.
If you didn't change the config, the default config is shown, e.g.:
http://localhost:9870/conf
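The /conf endpoint returns the effective configuration as XML, one <property> element per setting. A short Python sketch shows how to pick it apart; the sample response below is illustrative (a live NameNode would be queried with e.g. urllib.request.urlopen("http://localhost:9870/conf")):

```python
# Sketch: parsing the XML that the NameNode's /conf endpoint returns.
# The sample below is a hand-written illustration of that format.
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
    <source>hdfs-default.xml</source>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:8020</value>
    <source>core-site.xml</source>
  </property>
</configuration>"""

# Map each property name to its value and the file it came from.
conf = {
    p.findtext("name"): (p.findtext("value"), p.findtext("source"))
    for p in ET.fromstring(sample).iter("property")
}

print(conf["dfs.replication"])  # ('3', 'hdfs-default.xml')
```

The <source> element is handy here: it tells you which file each value came from, so defaults are easy to tell apart from your own overrides.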
Upvotes: 2
Reputation: 4490
For Hortonworks, the location would be:
/etc/hadoop/conf/hdfs-site.xml
Upvotes: 5
Reputation: 1399
These files are all found in the hadoop/conf directory.
To set up HDFS you have to configure core-site.xml and hdfs-site.xml.
HDFS works in two modes: distributed (a multi-node cluster) and pseudo-distributed (a cluster on a single machine).
For the pseudo-distributed mode you have to configure:
In core-site.xml:
<!-- namenode -->
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:8020</value>
</property>
In hdfs-site.xml:
<!-- storage directory for HDFS: the hadoop.tmp.dir property defaults to /tmp/hadoop-${user.name} -->
<property>
<name>hadoop.tmp.dir</name>
<value>/your-dir/</value>
</property>
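Both fragments above follow Hadoop's standard <configuration>/<property> layout. A small Python sketch (the temp file stands in for a real core-site.xml; point read_conf at your own conf directory in practice) shows how to read such a file back and sanity-check your settings:

```python
# Sketch: reading the name -> value pairs out of a Hadoop *-site.xml file.
import os
import tempfile
import xml.etree.ElementTree as ET

def read_conf(path):
    """Return a dict of the <property> name/value pairs in a Hadoop config file."""
    return {
        p.findtext("name"): p.findtext("value")
        for p in ET.parse(path).getroot().iter("property")
    }

# Write the core-site.xml fragment from the answer to a temp file and read it back.
xml_text = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>"""

with tempfile.NamedTemporaryFile("w", suffix=".xml", delete=False) as f:
    f.write(xml_text)
    path = f.name

conf = read_conf(path)
print(conf["fs.default.name"])  # hdfs://localhost:8020
os.remove(path)
```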
Each property has its hardcoded default value.
Please remember to set up password-less ssh login for the hadoop user before starting HDFS.
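The password-less ssh setup mentioned above usually boils down to generating a key and authorizing it. The sketch below writes the key to a scratch directory purely for illustration; in practice the key and authorized_keys live under ~/.ssh:

```shell
# Sketch: password-less ssh for the hadoop user.
# $KEYDIR is a scratch location for illustration; normally use ~/.ssh.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t rsa -P '' -f "$KEYDIR/id_rsa"
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"
# With the key in ~/.ssh instead of $KEYDIR, `ssh localhost` should
# then log in without prompting for a password.
```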
P.S.
If you downloaded Hadoop from Apache, you can consider switching to a Hadoop distribution:
Cloudera's CDH, Hortonworks, or MapR.
If you install Cloudera CDH or Hortonworks HDP, you will find the files in /etc/hadoop/conf/.
Upvotes: 19
Reputation: 151
These files can be seen in /usr/lib/hadoop-2.2.0/etc/hadoop; in that location you can find all the XMLs.
Upvotes: 1
Reputation: 7255
All the configuration files will be located in the etc/hadoop/ directory of the extracted tarball. hdfs-site.xml may be named hdfs-site.xml.template; if so, you will need to rename it to hdfs-site.xml.
If you want to see what options are available for HDFS, check the doc bundled in the tarball at share/doc/hadoop/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml.
Upvotes: 4