Reputation: 117
All my nodes are up and running (verified with the jps command), but I am still unable to connect to the HDFS filesystem. Whenever I click on "Browse the filesystem" on the Hadoop NameNode localhost:8020 page, I get a "Connection Refused" error. I have also tried formatting and restarting the NameNode, but the error persists. Can anyone please help me solve this issue?
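A quick way to confirm the symptom is to check whether anything is listening on the NameNode's RPC port at all. Below is a minimal Python sketch; the host and port mirror the question's localhost:8020 and are assumptions to replace with your own values:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timeout, or unreachable host all land here.
        return False

# Usage (values from the question; adjust to your fs.default.name):
#   is_port_open("localhost", 8020)
```

If this returns False, the NameNode process is not actually bound to that host/port, regardless of what jps shows.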
Upvotes: 7
Views: 21169
Reputation: 1228
In core-site.xml, change
<property>
    <name>fs.default.name</name>
    <value>hdfs://hadoopvm:8020</value>
    <final>true</final>
</property>
to the IP address:
<property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.132.129:8020</value>
    <final>true</final>
</property>
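To double-check which address the client will actually use, you can read fs.default.name back out of the config file. A minimal sketch, assuming the standard Hadoop *-site.xml layout (the file path you pass in is up to you):

```python
import xml.etree.ElementTree as ET

def get_hadoop_property(conf_path: str, name: str):
    """Return the value of a named <property> from a Hadoop *-site.xml file,
    or None if the property is not present."""
    root = ET.parse(conf_path).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

# Usage (path is an example, adjust to your install):
#   get_hadoop_property("./conf/core-site.xml", "fs.default.name")
```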
Upvotes: 2
Reputation: 384
HDFS may use port 9000 in some distributions/builds. Please double-check your NameNode port.
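If you're not sure which port your build uses, one option is to probe the usual candidates. A sketch; the candidate list (8020/9000 for RPC, 50070 for the classic web UI) is an assumption based on common defaults:

```python
import socket

# Common NameNode ports across distributions; adjust the host as needed.
CANDIDATE_PORTS = [8020, 9000, 50070]

def probe_ports(host: str, ports, timeout: float = 2.0):
    """Return the subset of ports on which a TCP connection succeeds."""
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, refused, or filtered
    return open_ports

# Usage:
#   probe_ports("localhost", CANDIDATE_PORTS)
```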
Upvotes: 3
Reputation: 2419
Check whether all of your services are running (NameNode, DataNode, JobTracker, TaskTracker) by running the jps command.
Try starting them one by one:
./bin/stop-all.sh
./bin/hadoop-daemon.sh start namenode
./bin/hadoop-daemon.sh start jobtracker
./bin/hadoop-daemon.sh start tasktracker
./bin/hadoop-daemon.sh start datanode
If you're still getting the error, stop them again and clean your temp storage directory (the directory is set in ./conf/core-site.xml), then run:
./bin/stop-all.sh
rm -rf /tmp/hadoop*
./bin/hadoop namenode -format
Check the logs in the ./logs folder:
tail -200 hadoop*jobtracker*.log
tail -200 hadoop*namenode*.log
tail -200 hadoop*datanode*.log
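To narrow the tail of each log down to the relevant lines, you can filter for common failure signatures. A sketch; the pattern list here is an assumption, not exhaustive:

```python
import re
from pathlib import Path

# Hypothetical list of failure signatures often seen in NameNode/DataNode logs.
ERROR_PATTERNS = re.compile(
    r"Connection refused|BindException|Incompatible namespaceIDs|ERROR"
)

def find_errors(log_path: str, last_n: int = 200):
    """Return lines from the last `last_n` lines of a log that match
    one of the known error patterns."""
    lines = Path(log_path).read_text().splitlines()[-last_n:]
    return [ln for ln in lines if ERROR_PATTERNS.search(ln)]

# Usage (glob the ./logs folder yourself, e.g. with pathlib.Path.glob):
#   find_errors("./logs/hadoop-user-namenode-host.log")
```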
Hope it helps.
Upvotes: 9