Reputation: 361
I am seeing the Bitbucket Pipelines error "Container 'docker' exceeded memory limit" while running my pipeline. I tried all of the possible service memory limits as per the documentation below, but the issue was not resolved.
Databases and service containers - Service memory limits
Can you help resolve the issue?
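For reference, the service memory limit from that page is set like this (a sketch; the memory value is only an example, and the allowed maximum depends on the step size):

definitions:
  services:
    docker:
      memory: 3072  # example value; default is 1024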
Upvotes: 25
Views: 27009
Reputation: 31
Updated the pipeline YAML to this and it worked:
- step:
    name: 'Bitbucket pipeline test'
    services:
      - docker
    size: 2x

definitions:
  services:
    docker:
      memory: 4096  # as per your requirement
Upvotes: 2
Reputation: 91
As said previously, you can use size: 2x on a step to increase the memory limit for that step, or set it in options, which enables 2x size for all steps automatically; see the sketch below.
However, it is worth noting that doing so will consume twice the number of build minutes compared to a regular step, effectively costing twice as much, as described here.
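For illustration, a minimal sketch of the two placements (the step name and script are placeholders):

# Option A: double the memory for only this step
pipelines:
  default:
    - step:
        name: Build
        size: 2x
        script:
          - ./build.sh

# Option B: enable 2x size for all steps
options:
  size: 2x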
Upvotes: 6
Reputation: 229
It is due to your build taking more memory than allocated. In order to resolve this, you need to add this in your bitbucket-pipelines.yml:
image: .....

options:            # <= Add this
  docker: true      # <= Add this
  size: 2x          # <= Add this

pipelines:
  branches:
    master:
      - step:
          caches:
            - ....
          services:     # <= Add this
            - docker    # <= Add this

definitions:            # <= Add this
  services:             # <= Add this
    docker:             # <= Add this
      memory: 4096      # <= Add this
Upvotes: 17
Reputation: 721
I contacted Bitbucket, and they provided a solution:
options:
  docker: true
  size: 2x

- step:
    name: XXXX
    image: google/cloud-sdk:latest
    services:
      - docker
    size: 2x

definitions:
  services:
    docker:
      memory: 4096
Upvotes: 41