Reputation: 1353
My Spark job fails with the following error:
Diagnostics: Container [pid=7277,containerID=container_1528934459854_1736_02_000001] is running beyond physical memory limits. Current usage: 1.4 GB of 1.4 GB physical memory used; 3.1 GB of 6.9 GB virtual memory used. Killing container.
Upvotes: 0
Views: 153
Reputation: 666
Your container is being killed by YARN because the memory it requested (1.4 GB here) is less than what the task actually needs. The fix is to give the container more memory.
You have 2 choices: increase the memory Spark requests for its containers (driver/executor memory plus the memory overhead), or increase the per-container limits YARN enforces (yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb in yarn-site.xml).
Either way, make sure the container that was killed ends up with at least around 2 GB; a sketch of the first option follows below.
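Here is a minimal PySpark sketch of the first option. The property names (spark.executor.memory, spark.executor.memoryOverhead, spark.driver.memoryOverhead) are standard Spark-on-YARN settings, but the 2 GB / 512 MB values are only illustrative assumptions; tune them to your workload.

```python
from pyspark.sql import SparkSession

# Sketch: request roughly 2 GB of heap plus explicit off-heap overhead per
# executor container, instead of the defaults that led to the 1.4 GB YARN limit.
spark = (
    SparkSession.builder
    .appName("memory-tuned-job")
    # Executor containers: heap + overhead must fit within the YARN container size.
    .config("spark.executor.memory", "2g")
    .config("spark.executor.memoryOverhead", "512m")
    # Driver/ApplicationMaster memory must be set before the driver JVM starts,
    # so in practice pass it at submit time, for example:
    #   spark-submit --driver-memory 2g --conf spark.driver.memoryOverhead=512m ...
    .getOrCreate()
)
```

Note that the killed container (container_..._000001) is the ApplicationMaster, so the driver-side settings are the ones that matter here. If YARN itself caps containers below the size you request, you also need to raise yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb in yarn-site.xml.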
Upvotes: 0