Reputation: 1927
I have installed Hadoop 2.6.0 and I'm playing around with it. I'm trying the pseudo-distributed setup, following the instructions at http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Execution and I'm stuck at the 5th step, i.e. when I run the command
bin/hdfs dfs -put etc/hadoop input
I get the following error:
15/02/02 00:35:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `input': No such file or directory
Why am I getting this error? How can I resolve it?
Upvotes: 9
Views: 30451
Reputation: 1
Change the user/owner if you want to write a file from root directly to HDFS:
sudo -u hdfs hdfs dfs -chown root:hdfs /user/file
sudo -u hdfs hdfs dfs -chmod -R 775 /user/file
Or:
sudo -u hdfs hdfs dfs -chown -R hdfs:hadoop /user/file
sudo -u hdfs hdfs dfs -chmod -R 1777 /user/file
Then use the put command:
sudo -u hdfs hdfs dfs -put /root/project/* /file
This works for me:
[root@spark ~]# sudo -u hdfs hdfs dfs -put /root/project/* /file/
put: `file/': No such file or directory
[root@spark ~]# hdfs dfs -put /root/project/* /file
put: Permission denied: user=root, access=WRITE, inode="/file":hdfs:hadoop:drwxr-xr-t
[root@spark ~]# sudo -u hdfs hdfs dfs -chown root:hdfs /file
[root@spark ~]# hdfs dfs -put /root/project/*.csv /file
[root@spark ~]# hdfs dfs -ls /file
Found 12 items
-rw-r--r--   1 root hdfs   4662272 2019-04-28 06:23 /file/StokKs.csv
-rw-r--r--   1 root hdfs    302648 2019-04-28 06:23 /file/Stocks.csv
-rw-r--r--   1 root hdfs    284628 2019-04-28 06:23 /file/Stocks.csv
-rw-r--r--   1 root hdfs    568949 2019-04-28 06:23 /file/Satellite.csv
-rw-r--r--   1 root hdfs    579302 2019-04-28 06:23 /file/Stocks.csv
-rw-r--r--   1 root hdfs  24805721 2019-04-28 06:23 /file/medical.csv
-rw-r--r--   1 root hdfs   5650234 2019-04-28 06:23 /file/bank.csv
-rw-r--r--   1 root hdfs   2893092 2019-04-28 06:23 /file/facebook.csv
Upvotes: 0
Reputation: 515
Just put "/" in front of input, as it is a directory:
./bin/hdfs dfs -put etc/hadoop /input
Hope this helps.
Upvotes: 4
Reputation: 1
There are two errors here. The first one, the native-Hadoop-library warning for your platform, appears because you have not installed the Hadoop winutils for your Hadoop version; check this answer for more details: https://stackoverflow.com/a/46382570/6337190 The second error, "no such file or directory", occurs because you have to specify the path correctly. Change directory to your hadoop/bin/ and run the following commands.
To make the directory:
hdfs dfs -mkdir /input
To put a file in the directory:
hdfs dfs -put /path/to/file.txt /input
To check the files in the directory:
hdfs dfs -ls /input
Upvotes: 0
Reputation: 39
SOLVED:
1. Make your directory in HDFS:
hdfs dfs -mkdir /input_file_name
2. Copy data to HDFS:
hadoop fs -put filename.txt /input_file_name/output_file_name
Upvotes: 0
Reputation: 61
There are two parts to the answer. First, create the folder:
hadoop fs -mkdir /hadoopinput
Or, for newer versions:
hdfs dfs -mkdir /hadoopinput
Now you can put a file inside the folder:
hdfs dfs -put /Users/{username}/Desktop/file01 /hadoopinput
To check whether the file was copied into the folder, use the following command:
hdfs dfs -ls /hadoopinput
Upvotes: 0
Reputation: 493
In addition to what Ashrith wrote, -p can also be added, just in case the directory is not yet created:
bin/hadoop fs -mkdir -p /path/to/hdfs/dir
Hope this helps someone else.
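Putting the steps above together, a minimal sequence might look like the sketch below; the /user/hduser/input path and the hduser username are illustrative assumptions, not taken from the question, and the commands require a running HDFS cluster:

```shell
# Create the target directory, including any missing parents (-p),
# then copy the local files in and verify. Paths are illustrative.
bin/hadoop fs -mkdir -p /user/hduser/input
bin/hdfs dfs -put etc/hadoop /user/hduser/input
bin/hdfs dfs -ls /user/hduser/input
```

With -p, the mkdir succeeds even when /user or /user/hduser does not exist yet, which avoids the original "No such file or directory" failure.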
Upvotes: 11
Reputation: 6855
You are getting the error because there is no such directory specified in the path. Please take a look at my answer to a similar question, which explains how Hadoop interprets relative paths.
Make sure you create the directory first using:
bin/hadoop fs -mkdir input
and then try to re-execute the -put command.
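The relative-path behavior behind the original `put: 'input': No such file or directory` error can be sketched as follows. HDFS treats a path without a leading "/" as relative to the user's home directory, /user/&lt;username&gt;; the function below mimics that rule for illustration only and is not part of the real Hadoop API, and the username is a made-up example:

```python
def resolve_hdfs_path(path: str, username: str) -> str:
    """Illustrative sketch of HDFS relative-path resolution.

    A path starting with "/" is absolute; any other path is resolved
    against the user's HDFS home directory, /user/<username>.
    """
    if path.startswith("/"):
        return path
    return f"/user/{username}/{path}"

# "input" resolves under the user's home directory, which must already exist:
print(resolve_hdfs_path("input", "hduser"))   # /user/hduser/input
# An absolute path is used as-is:
print(resolve_hdfs_path("/input", "hduser"))  # /input
```

This is why `hdfs dfs -put etc/hadoop input` fails on a fresh cluster: the resolved target /user/&lt;username&gt;/input does not exist until it is created with mkdir.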
Upvotes: 3