user1844086

Reputation:

java.net.ConnectException: Connection refused error when running Hive

I'm trying to work through a Hive tutorial, in which I enter the following:

load data local inpath '/usr/local/Cellar/hive/0.11.0/libexec/examples/files/kv1.txt' overwrite into table pokes;

This results in the following error:

FAILED: RuntimeException java.net.ConnectException: Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused

I see that there are some replies on SA having to do with configuring my IP address and localhost, but I'm not familiar with the concepts in those answers. I'd appreciate anything you can tell me about the fundamentals of what causes this kind of error and how to fix it. Thanks!

Upvotes: 8

Views: 39063

Answers (6)

Vadim Zin4uk

Reputation: 1806

I had a similar problem with a connection timeout:

WARN DFSClient: Failed to connect to /10.165.0.27:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information

DFSClient was resolving nodes by internal IP. Here's the solution for this:

.config("spark.hadoop.dfs.client.use.datanode.hostname", "true")

Upvotes: 0

Keshav Pradeep Ramanath

Reputation: 1687

I was able to resolve the issue by executing the below command:

start-all.sh

This ensures that the Hadoop services Hive depends on have started.

Starting Hive after that was straightforward.
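For example, a rough sketch of the sequence (assuming Hadoop's sbin directory is on your PATH; on Hadoop 2.x, start-dfs.sh plus start-yarn.sh replace the deprecated start-all.sh):

start-all.sh   # bring up the HDFS and YARN/MapReduce daemons
jps            # confirm NameNode and friends are actually running
hive           # only then start the Hive CLI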

Upvotes: 0

lukalau

Reputation: 1

I had the same problem when setting up Hive and solved it by changing my /etc/hostname.

It previously contained my user_machine_name; after I changed it to localhost, everything worked.

I guess this is because Hadoop resolves your hostname using this /etc/hostname file, so it was directed to your user_machine_name while the Hadoop service was actually running on localhost.
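For illustration, the change amounts to something like this (user_machine_name being the placeholder used above):

/etc/hostname before:
user_machine_name

/etc/hostname after:
localhost

You can check what the machine currently reports with the hostname command.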

Upvotes: 0

Amey Jadiye

Reputation: 3154

An easy way I found is to edit the /etc/hosts file. By default it looks like:

127.0.0.1    localhost
127.0.1.1    user_user_name

Just change the 127.0.1.1 entry to 127.0.0.1; that's it. Restart your shell and restart your cluster with start-all.sh.
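After the edit the file would read:

127.0.0.1    localhost
127.0.0.1    user_user_name

so that both names resolve to the loopback address the Hadoop daemons are actually listening on.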

Upvotes: 0

Haimei

Reputation: 13015

The reason you get this error is that Hive needs Hadoop as its base, so you need to start Hadoop first.

Here are some steps.

Step 1: download Hadoop and unzip it

Step 2: cd #your_hadoop_path

Step 3: ./bin/hadoop namenode -format

Step 4: ./sbin/start-all.sh

Then go back to #your_hive_path and start Hive again.
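The localhost:9000 in the error message is the default filesystem URI Hive picks up from Hadoop. As a point of reference, a typical pseudo-distributed core-site.xml (an assumed example, not necessarily the asker's configuration) points the default filesystem at exactly that address:

<configuration>
  <property>
    <name>fs.default.name</name>  <!-- called fs.defaultFS in Hadoop 2.x -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

"Connection refused" on that port simply means nothing is listening there, i.e. the NameNode is not up.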

Upvotes: 4

vishnu viswanath

Reputation: 3854

This is because Hive is not able to contact your NameNode.

Check whether your Hadoop services have started properly.

Run the command jps to see which services are running.
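On a healthy single-node installation the output should include the Hadoop daemons, roughly like this (the PIDs are made up, and the exact set of daemons depends on Hadoop 1.x vs 2.x):

$ jps
4821 NameNode
4903 DataNode
5012 SecondaryNameNode
5144 ResourceManager
5230 NodeManager
5399 Jps

If NameNode is missing from the list, the connection-refused error above is expected; start the daemons with start-dfs.sh (or start-all.sh) and retry.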

Upvotes: 12
