Reputation: 6139
root# bin/hadoop fs -mkdir t
mkdir: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/root/t. Name node is in safe mode.
I am not able to create anything in HDFS. I ran:
root# bin/hadoop fs -safemode leave
but it shows:
safemode: Unknown command
What is the problem?
Solution: http://unmeshasreeveni.blogspot.com/2014/04/name-node-is-in-safe-mode-how-to-leave.html?m=1
Upvotes: 140
Views: 241700
Reputation: 131
Safe mode on means HDFS is in read-only mode; safe mode off means HDFS is readable and writable.
In Hadoop 2.6.0, we can check the status of the name node with the help of the commands below:
To check the name node status:
$ hdfs dfsadmin -safemode get
To enter safe mode:
$ hdfs dfsadmin -safemode enter
To leave safe mode:
$ hdfs dfsadmin -safemode leave
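For reference, the get sub-command prints the current state, along the lines of:
$ hdfs dfsadmin -safemode get
Safe mode is ON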
Upvotes: 13
Reputation: 89
Try this:
sudo -u hdfs hdfs dfsadmin -safemode leave
Then check the status of safe mode:
sudo -u hdfs hdfs dfsadmin -safemode get
If it is still in safe mode, one possible reason is that there is not enough space on your node. You can check your node's disk usage with:
df -h
If the root partition is full, delete files or add space to the root partition, then retry the first step.
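If the local disks look fine, you can also check capacity as HDFS itself sees it (a quick sanity check; output trimmed here):
sudo -u hdfs hdfs dfsadmin -report
This prints Configured Capacity, DFS Used and DFS Remaining for the cluster as a whole and for each datanode.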
Upvotes: 7
Reputation: 539
Use the command below to turn off safe mode:
$ hdfs dfsadmin -safemode leave
Upvotes: 0
Reputation: 61
Run the command below as the HDFS OS user to disable safe mode:
sudo -u hdfs hadoop dfsadmin -safemode leave
Upvotes: 1
Reputation: 12010
To force the namenode to leave safe mode, the following command should be executed:
bin/hadoop dfsadmin -safemode leave
You are getting the Unknown command error because -safemode is not a sub-command of hadoop fs; it belongs to hadoop dfsadmin.
After the above command, I would also suggest running hadoop fsck once, so that any inconsistencies that have crept into HDFS can be sorted out.
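For example, to check the whole filesystem starting from the root (narrow the path to limit the scan):
bin/hadoop fsck /
It reports the total blocks plus any missing, corrupt or under-replicated ones, and ends with a summary such as The filesystem under path '/' is HEALTHY.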
Update:
Use the hdfs command instead of the hadoop command for newer distributions; the hadoop command is being deprecated for HDFS tasks:
hdfs dfsadmin -safemode leave
hadoop dfsadmin has been deprecated, and so has the hadoop dfs command; all HDFS-related tasks are being moved to the separate hdfs command.
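As a quick illustration of the renaming (the sub-commands and flags stay the same; only the entry point changes):
hadoop dfsadmin -safemode get   # old, prints a deprecation warning
hdfs dfsadmin -safemode get     # new equivalent
hadoop dfs -ls /                # old, deprecated
hdfs dfs -ls /                  # new equivalent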
Upvotes: 232
Reputation: 6214
If you use Hadoop version 2.6.1 above, while the command works, it complains that its depreciated. I actually could not use the hadoop dfsadmin -safemode leave
because I was running Hadoop in a Docker container and that command magically fails when run in the container, so what I did was this. I checked doc and found dfs.safemode.threshold.pct
in documentation that says
Specifies the percentage of blocks that should satisfy the minimal replication requirement defined by dfs.replication.min. Values less than or equal to 0 mean not to wait for any particular percentage of blocks before exiting safemode. Values greater than 1 will make safe mode permanent.
So I changed hdfs-site.xml into the following (in older Hadoop versions, apparently you need to do it in hdfs-default.xml):
<configuration>
<property>
<name>dfs.safemode.threshold.pct</name>
<value>0</value>
</property>
</configuration>
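Note that the namenode only reads this setting at startup, so you will need to restart it for the change to take effect. With a stock Hadoop 2.x layout that would be something like (paths and scripts vary by distribution):
sbin/hadoop-daemon.sh stop namenode
sbin/hadoop-daemon.sh start namenode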
Upvotes: 8
Reputation: 2614
Try this; it will work:
sudo -u hdfs hdfs dfsadmin -safemode leave
Upvotes: 31
Reputation: 1472
The namenode enters safe mode when there is a shortage of memory. As a result, HDFS becomes read-only, meaning one cannot create any additional directory or file in HDFS. To come out of safe mode, the following command is used:
hadoop dfsadmin -safemode leave
If you are using Cloudera Manager:
go to >> Actions >> Leave Safemode
But this does not always solve the problem. The complete solution lies in freeing up some space in memory. Use the following command to check your memory usage:
free -m
If you are using Cloudera, you can also check whether HDFS is showing signs of bad health; it is probably showing a memory issue related to the namenode. Allot more memory through the options available. I am not sure what commands to use for this if you are not using Cloudera Manager, but there must be a way. Hope it helps! :)
Upvotes: 5
Reputation: 5335
The command did not work for me, but the following did:
hdfs dfsadmin -safemode leave
I used the hdfs command instead of the hadoop command.
Check out the http://ask.gopivotal.com/hc/en-us/articles/200933026-HDFS-goes-into-readonly-mode-and-errors-out-with-Name-node-is-in-safe-mode- link too.
Upvotes: 22