2D_

Reputation: 611

hdfs dfs -mkdir, No such file or directory

Hi, I am new to Hadoop and I am trying to create a directory in HDFS called twitter_data. I have set up my VM on SoftLayer and installed and started Hadoop successfully.

This is the command I am trying to run:

hdfs dfs -mkdir hdfs://localhost:9000/user/Hadoop/twitter_data

And it keeps returning this error message:

 /usr/local/hadoop/etc/hadoop/hadoop-env.sh: line 2: ./hadoop-env.sh: Permission denied
16/10/19 19:07:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `hdfs://localhost:9000/user/Hadoop/twitter_data': No such file or directory

Why does it say there is no such file or directory? I am telling it to make the directory, so shouldn't it just create one? I am guessing it must be a permission issue, but I can't resolve it. Please help me, HDFS experts. I have been spending too much time on what seems to be a simple matter.

Thanks in advance.

Upvotes: 40

Views: 69992

Answers (3)

Abhinav Kumar

Reputation: 1

For Windows users only.

Start in Windows PowerShell:

1. cd c:/hadoopsetup

2. cd c:/hadoop-3.2.4/sbin/

3. ./start-dfs.cmd (this opens several command prompt windows; do not close them, just minimise them)

4. ./start-yarn.cmd (this also opens several command prompt windows; do not close them, just minimise them)

5. jps (check whether NameNode, DataNode, NodeManager, ResourceManager and Jps are all present)

After that use:

hadoop fs -mkdir /xyzfilename (always use a leading /)

hadoop fs -ls /
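The reason the leading / matters: HDFS resolves a relative path against the user's home directory, /user/&lt;username&gt;, which may not exist yet on a fresh cluster. A minimal sketch of that resolution rule (hdfs_resolve is a hypothetical helper for illustration only, not a real Hadoop command):

```shell
# Hypothetical helper illustrating how HDFS resolves paths; NOT part of Hadoop.
hdfs_resolve() {
  local path="$1" user="$2"
  case "$path" in
    /*) printf '%s\n' "$path" ;;                   # absolute path: used as-is
    *)  printf '/user/%s/%s\n' "$user" "$path" ;;  # relative: under the user's HDFS home
  esac
}

hdfs_resolve twitter_data Hadoop   # -> /user/Hadoop/twitter_data
hdfs_resolve /xyzfilename Hadoop   # -> /xyzfilename
```

So `hadoop fs -mkdir twitter_data` would try to create /user/&lt;you&gt;/twitter_data and fail if that home directory is missing, while a path starting with / avoids the problem.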

Upvotes: 0

Vipal Patel

Reputation: 119

Use the steps below to create the directory:

1) With Hadoop stopped, format the namenode:

$ hadoop namenode -format

2) Start Hadoop:

$ start-all.sh

3) Now make the initial directory first, then create the next one inside it:

$ hadoop fs -mkdir /user
$ hadoop fs -mkdir /user/Hadoop
$ hadoop fs -mkdir /user/Hadoop/twitter_data

Follow the above steps to solve the problem.

Upvotes: 11

user4601931

Reputation: 5304

It is because the parent directories do not exist yet either. Try hdfs dfs -mkdir -p /user/Hadoop/twitter_data. The -p flag indicates that all nonexistent directories leading up to the given directory should be created as well.
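The -p semantics here are the same as POSIX mkdir -p: every missing ancestor is created in one call. A quick local illustration of the same behaviour (the local filesystem stands in for HDFS; the temp path is just an example):

```shell
# Local analogy for `hdfs dfs -mkdir -p`: create all missing parents in one call.
tmp=$(mktemp -d)

# Without -p this would fail, because $tmp/user and $tmp/user/Hadoop do not exist yet.
mkdir -p "$tmp/user/Hadoop/twitter_data"

ls -d "$tmp/user/Hadoop/twitter_data"   # the whole chain now exists

rm -rf "$tmp"
```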

As for the question you posed in the comments, simply type into your browser http://<host name of the namenode>:<port number>/ (the NameNode web UI listens on port 50070 by default in Hadoop 2.x and 9870 in Hadoop 3.x).

Upvotes: 60
