Reputation: 2301
I am using Hadoop 2.2 on my Ubuntu single-node cluster. I started the cluster with start-all.sh, but when I try to load a text file into HDFS it throws the following error:
hduser@ubuntu:~$ hadoop dfs -put /home/aditya/Desktop/data.txt
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/11/26 00:40:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: Call From ubuntu/127.0.1.1 to localhost:54310 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
These are my /etc/hosts entries; please check them:
127.0.0.1 localhost
127.0.1.1 ubuntu
# The following lines are desirable for IPv6 capable hosts
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
ff02::3 ip6-allhosts
I searched for this error and tried the suggested fixes, but without success. Please help me with your ideas. Thank you.
Upvotes: 0
Views: 4454
Reputation: 1263
What version of Hadoop are you using? How many nodes are in the cluster? The error you're seeing usually results from /etc/hosts settings: make sure all boxes can ping each other by name. In our small two-node cluster (Hadoop 2.2.0), I removed all hostname-to-127.0.1.1 mappings and bound each hostname to its real IP.
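For example (a sketch only — the IP 192.168.1.10 is a placeholder, substitute the machine's actual LAN address), the cleaned-up /etc/hosts would map the hostname to the real interface instead of 127.0.1.1:

```
127.0.0.1      localhost
192.168.1.10   ubuntu
```

With the 127.0.1.1 line removed, daemons that resolve the hostname bind to an address other nodes (and clients) can actually reach.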
Please also take a look at this Stack Overflow question for /etc/hosts settings: Hadoop (local and host destination do not match) after installing hive.
I strongly recommend reading the Hadoop 2 setup docs linked below, since several things have changed.
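As a quick sanity check (not from the original answer; the port number 54310 comes from the error message in the question), you can confirm whether anything is actually listening on the NameNode port. "Connection refused" usually just means the NameNode process never came up:

```shell
# List running Hadoop JVMs; a healthy single-node cluster should show
# NameNode, DataNode and SecondaryNameNode among them.
command -v jps >/dev/null && jps || echo "jps not on PATH"

# Probe the NameNode RPC port using bash's /dev/tcp feature;
# exit status 0 means something accepted the connection.
if bash -c 'exec 3<>/dev/tcp/localhost/54310' 2>/dev/null; then
  echo "port 54310: open"
else
  echo "port 54310: connection refused"
fi
```

If nothing is listening, start-all.sh did not actually bring the NameNode up; its log file under the Hadoop logs directory will say why.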
Upvotes: 1