郑善宇

Reputation: 41

Hadoop setup issue: "ssh: Could not resolve hostname now.: No address associated with hostname"

When I set up a Hadoop cluster on VMware and run the sbin/start-dfs.sh command, I hit an SSH problem. It says:

ssh: Could not resolve hostname now.: No address associated with hostname

I have checked the hostname and IP address with vi /etc/hosts, and I have also checked /etc/profile. I am sure there is no mistake there.
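For reference, a minimal sketch of the checks involved here (run on each VM; the output depends on your own /etc/hosts):

```shell
# Print this machine's hostname.
hostname

# Look up how that hostname resolves (/etc/hosts first, then DNS,
# per the "hosts" line in /etc/nsswitch.conf).
getent hosts "$(hostname)"

# Show the hosts file itself; every cluster node should have an entry.
cat /etc/hosts
```

If `getent hosts` prints nothing for a hostname, SSH will fail with exactly this "Could not resolve hostname" error.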

Upvotes: 2

Views: 6938

Answers (1)

Weiwei Yang

Reputation: 19091

A few suggestions:

  1. Check that the hostnames in hdfs-site.xml are set correctly. If you are running a single-host setup and you set the namenode host to localhost, make sure localhost maps to 127.0.0.1 in your /etc/hosts. If you are setting up multiple nodes, use the FQDN of each host in your configuration, and make sure each FQDN maps to the correct IP address in /etc/hosts.
  2. Set up passwordless SSH. Note that start-dfs.sh requires passwordless SSH from the host where you run the command to the rest of the cluster nodes. Verify this by running ssh hostx date; it should not ask for a password.
  3. Check the hostname in the error message (you may not have pasted the complete log). For the problematic hostname, run the SSH command manually to make sure it can be resolved. If not, check /etc/hosts. A common /etc/hosts setup looks like:

127.0.0.1 localhost localhost.localdomain

::1 localhost localhost.localdomain

172.16.151.224 host1.test.com host1

172.16.152.238 host2.test.com host2

172.16.153.108 host3.test.com host3
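The steps above can be sketched as shell commands. This is a hedged example: host1.test.com stands in for one of your own cluster nodes, and the one-time SSH setup lines are shown as comments because they modify keys on real machines.

```shell
# One-time passwordless SSH setup from the control node to each worker
# (host1.test.com is a placeholder for your node's FQDN):
#   ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa   # generate a key pair with no passphrase
#   ssh-copy-id host1.test.com                 # install the public key on the remote host
#   ssh host1.test.com date                    # must print the date with NO password prompt

# Verify that a hostname from the error message actually resolves
# (substitute the failing hostname for localhost):
getent hosts localhost
```

If the last command prints nothing for the failing hostname, add the missing "IP  FQDN  shortname" line to /etc/hosts on every node, following the layout shown above.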

Upvotes: 1
