user5699859


AWS - Hadoop cluster - Nodes started but not working

I have a Hadoop cluster set up on my local servers that works fine, and I am replicating it on AWS servers (1 master NameNode, 1 secondary NameNode, 7 slaves). I am able to start Hadoop, but I am not able to open pages like :50070/dfshealth.jsp. I did the installation and SSH public key authentication the same way I did in the local setup, and there is nothing unusual in the logs either. Is there anything else I can look into?
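One quick check worth adding to the question, run on the master node itself: confirm the NameNode web UI is actually listening before suspecting the network (50070 is the default UI port; this is a generic sketch, not specific to any one Hadoop version):

```shell
# Run on the master: is anything listening on the NameNode web UI port?
PORT=50070
if netstat -tln 2>/dev/null | grep -q ":${PORT} "; then
  STATUS="NameNode UI port ${PORT} is listening"
else
  STATUS="nothing listening on port ${PORT}"
fi
echo "$STATUS"
```

If the port is listening locally but unreachable from the browser, the problem is almost certainly network/firewall configuration rather than Hadoop itself.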

Upvotes: 1

Views: 93

Answers (2)

Durga Viswanath Gadiraju

Reputation: 3956

You need to open the ports in the security group, and make sure you use the same security group on all the nodes in the cluster. Also, by default you will only be able to connect using the elastic IP/public DNS (which starts with ec2*). If you want to use the private IPs, you need to set up SSH tunneling.
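The SSH tunneling mentioned above can be sketched like this; the key path, public DNS and private IP are all placeholders for your own values:

```shell
# Sketch of an SSH tunnel for reaching the NameNode UI over a private IP.
KEY="$HOME/.ssh/mykey.pem"                             # placeholder key
PUBLIC_DNS="ec2-203-0-113-10.compute-1.amazonaws.com"  # placeholder DNS
NN_PRIVATE_IP="10.0.0.5"                               # placeholder IP
# -N: no remote command, -L: forward local 50070 to the NameNode's 50070
TUNNEL_CMD="ssh -i $KEY -N -L 50070:${NN_PRIVATE_IP}:50070 ec2-user@${PUBLIC_DNS}"
echo "$TUNNEL_CMD"
```

With the tunnel running you would browse http://localhost:50070/dfshealth.jsp and the traffic is forwarded through the public-facing host to the private address.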

Upvotes: 0

Thanga

Reputation: 8101

If the logs are clean, then ensure the necessary Hadoop ports are open. Unlike your local setup, on AWS inbound traffic is blocked unless the security group explicitly allows it, so you have to open the Hadoop HTTP and RPC ports (if they are not open already). This should solve the issue.

Upvotes: 1
