Reputation: 2255
I am running a browser on a single-node Hortonworks Hadoop cluster (HDP 2.3.4) on CentOS 6.7. Using localhost:8000 and <hostname>:8000, I can access Hue, and the same works for Ambari at 8080. Using <hostname>:50070, I can access the NameNode web UI, but with localhost:50070 I cannot establish a connection. So I assume localhost is blocked for that port, while the hostname is not. How can I set things up so that localhost and <hostname> have the same port configuration?
Upvotes: 1
Views: 3817
Reputation: 9844
This likely indicates that the NameNode HTTP server socket is bound to a single network interface, but not the loopback interface. The NameNode HTTP server address is controlled by the configuration property dfs.namenode.http-address in hdfs-site.xml. Typically this specifies a host name or IP address, and this maps to a single network interface. You can tell it to bind to all network interfaces by setting the property dfs.namenode.http-bind-host to 0.0.0.0 (the wildcard address, matching all network interfaces). The NameNode must be restarted for this change to take effect.
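For reference, a minimal hdfs-site.xml sketch of that change (the host name below is a placeholder; keep whatever dfs.namenode.http-address your cluster already uses and only add the bind-host property):

<!-- hdfs-site.xml (sketch; placeholder host name, adjust to your cluster) -->
<configuration>
  <!-- Existing entry: the advertised HTTP address of the NameNode -->
  <property>
    <name>dfs.namenode.http-address</name>
    <value>your-hostname.example.com:50070</value>
  </property>
  <!-- New entry: bind the HTTP server to all interfaces, including loopback -->
  <property>
    <name>dfs.namenode.http-bind-host</name>
    <value>0.0.0.0</value>
  </property>
</configuration>

On an Ambari-managed HDP cluster, a change like this is usually made through Ambari's HDFS configuration screens (and the NameNode restarted from there) rather than by editing the file directly, so that Ambari does not overwrite it.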
There are similar properties for other Hadoop daemons. For example, YARN has a property named yarn.resourcemanager.bind-host for controlling how the ResourceManager binds to a network interface for its RPC server.
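As a sketch, the analogous entry would go in yarn-site.xml:

<!-- yarn-site.xml (sketch): bind the ResourceManager servers to all interfaces -->
<configuration>
  <property>
    <name>yarn.resourcemanager.bind-host</name>
    <value>0.0.0.0</value>
  </property>
</configuration>

The ResourceManager would likewise need a restart for the setting to take effect.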
More details are in the Apache Hadoop documentation for hdfs-default.xml and yarn-default.xml. There is also full coverage of multi-homed deployments in HDFS Support for Multihomed Networks.
Upvotes: 4