shiv

Reputation: 65

OutOfMemoryError: Container killed due to memory usage on AWS batch job

I am using AWS ECS to run a CI/CD job, but I am getting the following error:

OutOfMemoryError: Container killed due to memory usage

The same job runs fine when the container is run on a separate EC2 instance. How can I troubleshoot this?

Upvotes: 3

Views: 8156

Answers (2)

deesolie

Reputation: 1062

I agree with @shubham's recommendation to increase the container's memory. A couple of other things to be mindful of:

  1. In your task definition, make sure the "Resource allocation limits" are in line with your task memory limits. That is, if you set your task memory to 6 GB, for example, but your limits are hard-capped at 4 GB, you will still receive this error.
  2. Per this SO answer, you may also want to increase your ulimit memlock hard and soft limits (note these are in KB) or your nofile limits (though I doubt nofile is causing the issue). For example, if you set your task memory to 6 GB and expect your batch job to use close to that, then you will want to increase your ulimit memlock (also in the task definition) to 6 GB, i.e. 6291456 KB (1024 * 1024 * 6).
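To make point 2 concrete, here is a sketch of the relevant part of a task definition for a 6 GB task; the container name is a placeholder, `memory` is in MiB, and the memlock limits are in KB as noted above:

```json
{
  "containerDefinitions": [
    {
      "name": "ci-job",
      "memory": 6144,
      "ulimits": [
        {
          "name": "memlock",
          "softLimit": 6291456,
          "hardLimit": 6291456
        }
      ]
    }
  ]
}
```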

Upvotes: 0

shubham

Reputation: 182

Try going through this AWS article: https://aws.amazon.com/premiumsupport/knowledge-center/ecs-resolve-outofmemory-errors/

I'd suggest looking into the most recent changes after which this issue started (assuming this used to work before).

If this is the first time you are deploying this, I'd recommend providing more memory to the container.
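To confirm that ECS actually killed the container for memory usage (rather than the job failing for some other reason), you can inspect the stopped task with the AWS CLI; a sketch, with a placeholder cluster name and task ID:

```shell
# Placeholders: replace my-cluster and the task ID with your own values.
aws ecs describe-tasks \
  --cluster my-cluster \
  --tasks abcdef1234567890 \
  --query 'tasks[0].{stoppedReason: stoppedReason, containers: containers[*].{name: name, exitCode: exitCode, reason: reason}}'
```

An OOM-killed container typically shows exit code 137 and the "OutOfMemoryError" text in the stopped reason.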

Upvotes: 3
