sawai singh

Reputation: 57

Hadoop WordCount and uploading a file into HDFS

Hello everyone. I am very new to Hadoop and I have installed it in pseudo-distributed mode. The configuration files are as follows:

core-site.xml

<configuration>

   <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:9000</value>
   </property>

</configuration>

hdfs-site.xml

<configuration>

   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>

   <property>
      <name>dfs.name.dir</name>
      <value>file:///home/hadoop_usr/hadoopinfra/hdfs/namenode</value>
   </property>

   <property>
      <name>dfs.data.dir</name>
      <value>file:///home/hadoop_usr/hadoopinfra/hdfs/datanode</value>
   </property>

</configuration>

I was able to start the NameNode and DataNode successfully.
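
A quick way to confirm that both daemons are really running is jps (the PIDs below are just illustrative):

   $ jps
   4821 NameNode
   4983 DataNode
   5102 Jps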

Now I want to put my file into HDFS in the following way:

[screenshot: adding file into hdfs]

What is going wrong? Why do I get an error message? Please help me resolve this problem.

If I use the following way to put the file into HDFS instead, appending the full HDFS URL, the command works fine:

[screenshot: update file with hdfs url]

Please help me understand why I get an error the first way, because I also get an error message when running my wordcount.jar with data.txt given as the input file on which the operation should be performed.
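
Since the screenshots are not reproduced here, the two attempts presumably looked roughly like this (the exact paths are an assumption, based on the data/data.txt destination mentioned in the answer below and the fs.default.name value above):

   # first attempt: relative destination path -- fails with an error
   hadoop fs -put data.txt data/data.txt

   # second attempt: destination given as a full HDFS URL -- works
   hadoop fs -put data.txt hdfs://localhost:9000/data.txt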

Thanks in advance.

Upvotes: 1

Views: 289

Answers (1)

Armin Braun

Reputation: 3683

The reason the first put operation to data/data.txt is not working is likely that you do not have a data folder in your HDFS yet. You can just create it using hadoop fs -mkdir /data.
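
For example, assuming the local file is called data.txt and an absolute HDFS path is used throughout (the paths here are only for illustration):

   hadoop fs -mkdir /data
   hadoop fs -put data.txt /data/data.txt
   hadoop fs -ls /data

On Hadoop 2.x you can also pass -p to -mkdir to create missing parent directories in one step. Once the file is in HDFS, the same /data/data.txt path can be passed to the wordcount job as its input.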

Upvotes: 1
