user468587

Reputation: 5031

How to list dead datanodes using Hadoop shell commands

I can view which datanodes are dead on the Hadoop dfsnodelist page at port 50070, but is there a command-line tool I can run to get the same info? I've tried

hadoop dfsadmin -report | grep 'Datanodes'

but that only tells me how many nodes there are in total, how many are alive, and how many are dead. Is there any way to get a list of the dead nodes, with their hostnames instead of their IPs?

Upvotes: 0

Views: 3156

Answers (2)

Guan Jyun Chen

Reputation: 123

Type

hdfs dfsadmin -report

This command will show both the live nodes and the dead nodes.
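As a sketch of how to pull just the dead-node section out of that report, you can range-match on the section header with sed. The here-document below stands in for the output of `hdfs dfsadmin -report` (the "Live datanodes (N):" / "Dead datanodes (N):" headers and fake hostnames are illustrative, not from a real cluster):

```shell
# Print only the dead-node section of a dfsadmin report.
# The here-doc simulates `hdfs dfsadmin -report` output;
# on a real cluster, pipe the command into the same sed filter.
sed -n '/^Dead datanodes/,/^$/p' <<'EOF'
Live datanodes (2):
Name: 10.1.3.1:50010 (cc001.fakedomain.local)

Dead datanodes (1):
Name: 10.1.3.5:50010 (cc005.fakedomain.local)
EOF
```

The `/start/,/end/` address range tells sed to print everything from the "Dead datanodes" header up to the next blank line, so the live-node section is skipped entirely.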

Upvotes: 0

P Joslin

Reputation: 41

Add the "-dead" option, then grep for "Name:". This omits the info for live nodes and prints only the identifying info for the dead ones.

 $ sudo -u hdfs hdfs dfsadmin -report -dead | grep Name:
Name: 10.1.3.5:50010 (cc005.fakedomain.local)
Name: 10.1.3.11:50010 (cc011.fakedomain.local)
Name: 10.1.3.20:50010 (cc020.fakedomain.local)

This Perl one-liner strips everything but the hostnames:

 sudo -u hdfs hdfs dfsadmin -report -dead |
   perl -ne 'next unless m/Name:/;' -e 's/^.*\((.*)\).*/\1 /;' -e 'print;'
cc005.fakedomain.local
cc011.fakedomain.local
cc020.fakedomain.local
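The same hostname extraction can also be sketched with awk, splitting each "Name:" line on parentheses. The sample line piped in below mimics the format shown above (fake hostname, not from a real cluster); on a real cluster you would pipe `hdfs dfsadmin -report -dead` into the same filter:

```shell
# Split "Name: IP:port (hostname)" lines on parentheses and
# print the second field, i.e. the bare hostname.
printf 'Name: 10.1.3.5:50010 (cc005.fakedomain.local)\n' |
  awk -F'[()]' '/Name:/ {print $2}'
```

With `-F'[()]'`, awk treats both `(` and `)` as field separators, so `$2` is exactly the text between them.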

Upvotes: 2
