BJC

Reputation: 501

Hadoop not utilizing available memory

I have a 2-node cluster (each node having 32 GB RAM and 8 cores). I have installed CDH 5.4, but I don't think the available memory is being utilized by Hadoop, as the page :8088/cluster/apps shows only 16 GB in the "Memory Total" column. Only once did I see "Memory Total" as 64 GB; I'm not sure what's going on. What could be the reason?

Thanks, Baahu

Upvotes: 2

Views: 3712

Answers (1)

Amal G Jose

Reputation: 2546

You have to configure the memory that YARN can use per node. This setting is in yarn-site.xml. There is a property that governs the maximum memory the NodeManager can allocate to containers, and it seems you are running with its default value of 8 GB: with two nodes at 8 GB each, the cluster reports 16 GB in "Memory Total". Set the property below to a higher value.

yarn.nodemanager.resource.memory-mb

Similarly, for cores there is another property:

yarn.nodemanager.resource.cpu-vcores
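As a sketch, the corresponding yarn-site.xml entries might look like this. The values are only illustrative for a 32 GB / 8-core node, leaving some headroom for the OS and the Hadoop daemons; tune them for your workload:

    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <!-- e.g. 24 GB of the 32 GB per node, specified in MB -->
      <value>24576</value>
    </property>
    <property>
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <!-- all 8 cores; lower this if other services share the node -->
      <value>8</value>
    </property>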

After setting these properties in the yarn-site.xml of all the nodes, restart the YARN cluster. This will increase the NodeManagers' memory share. Along with these, there are a few more properties for tuning the cluster. For more details, visit this url.
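If the cluster was installed from packages rather than managed through Cloudera Manager, the restart might look like the following (the service names below are the standard CDH 5 package names; adjust for your install, or restart YARN from the Cloudera Manager UI instead):

    # on each node running a NodeManager
    sudo service hadoop-yarn-nodemanager restart
    # on the node running the ResourceManager
    sudo service hadoop-yarn-resourcemanager restart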

Upvotes: 4
