Ree

Reputation: 903

Hadoop datanodes cannot find namenode in standalone setup

There are no errors in any log but I believe my datanode cannot find my namenode.

This is the error that leads me to this conclusion (according to what I've found online):

[INFO ]: org.apache.hadoop.ipc.Client - Retrying connect to server: /hadoop.server:9000. Already tried 4 time(s). 

jps output:

7554 Jps
7157 NameNode
7419 SecondaryNameNode
7251 DataNode

Can someone please offer some advice?

Output of dfsadmin -report:

Configured Capacity: 13613391872 (12.68 GB)
Present Capacity: 9255071744 (8.62 GB)
DFS Remaining: 9254957056 (8.62 GB)
DFS Used: 114688 (112 KB)
DFS Used%: 0.00%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0

-------------------------------------------------
Datanodes available: 1 (1 total, 0 dead)

Live datanodes:
Name: 192.172.1.49:50010 (Hadoop)
Hostname: Hadoop
Decommission Status : Normal
Configured Capacity: 13613391872 (12.68 GB)
DFS Used: 114688 (112 KB)
Non DFS Used: 4358320128 (4.06 GB)
DFS Remaining: 9254957056 (8.62 GB)
DFS Used%: 0.00%
DFS Remaining%: 67.98%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Last contact: Fri Aug 08 17:25:57 SAST 2014

Upvotes: 0

Views: 768

Answers (2)

Abhishek

Reputation: 143

Give each of your machines a hostname and add entries for them in the /etc/hosts file, like this:

#hostname hdserver.example.com
#vim /etc/hosts
192.168.0.25 hdserver.example.com
192.168.0.30 hdclient.example.com

and save it (use the correct IP addresses for your setup).

On the client, likewise set the hostname to hdclient.example.com and add the same entries to its /etc/hosts. This will allow name resolution to locate the machines by hostname.
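A minimal way to check that this worked, assuming the example hostnames above (getent queries the same resolver libraries that Hadoop's RPC client uses; nc is just one way to probe the port):

```shell
# Confirm each hostname resolves to the intended address
getent hosts hdserver.example.com   # should show 192.168.0.25
getent hosts hdclient.example.com   # should show 192.168.0.30

# Confirm the NameNode is actually listening on its RPC port
# (9000 here, matching the address in the question's log message)
nc -z hdserver.example.com 9000 && echo "NameNode port reachable"
```

If getent returns nothing, the /etc/hosts entry is missing or the hostname is misspelled; if getent resolves but nc fails, the NameNode is not listening on that interface/port.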

Upvotes: 1

Saran Reddy

Reputation: 9

Delete all contents of the Hadoop tmp directory: rm -rf path/of/tmp/directory

Format the NameNode (note: this erases all existing HDFS data): bin/hadoop namenode -format

Start all processes again: bin/start-all.sh
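The steps above as one sketch, assuming the scripts live under $HADOOP_HOME/bin and keeping the answer's placeholder tmp path (substitute your actual hadoop.tmp.dir):

```shell
# WARNING: formatting the NameNode erases all HDFS metadata and data.
cd "$HADOOP_HOME"
bin/stop-all.sh                  # stop any running daemons first
rm -rf path/of/tmp/directory/*   # clear hadoop.tmp.dir (placeholder path)
bin/hadoop namenode -format      # re-initialise NameNode storage
bin/start-all.sh                 # bring the daemons back up
jps                              # verify NameNode and DataNode are running
```

Clearing the tmp directory and reformatting keeps the NameNode's and DataNode's namespace IDs in sync, which is a common cause of a DataNode failing to register after repeated formats.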

Upvotes: 0
