tinytelly

Reputation: 279

Docker memory limit in pipelines

I use atlassian pipelines.

Sometimes pipelines fail with this message:

Build memory limit exceeded.

I also run the same Docker image locally that I run in pipelines (i.e. the image declared under image: in bitbucket-pipelines.yml).

I want a way to run our Docker image locally with the same memory limit that Pipelines enforces (4 GB): https://confluence.atlassian.com/bitbucket/limitations-of-bitbucket-pipelines-827106051.html#LimitationsofBitbucketPipelines-Buildlimits

I want to do this to make sure I stay within 90% of the Pipelines limit (so 3.6 GB).

Is this the way to achieve that?

docker run --rm --memory=3600M --memory-swap=3600M docker-image-same-as-we-run-in-pipelines
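Roughly, yes: --memory caps the container's RAM and setting --memory-swap to the same value disables swap, so the process is killed rather than swapping when it exceeds the cap. One way to double-check the limit the kernel actually applied is to read it back from inside the container (a sketch; docker-image-same-as-we-run-in-pipelines stands in for your real image, and the cgroup path depends on whether your host uses cgroup v1 or v2):

```shell
# Run with the same caps Pipelines enforces, then print the applied limit.
# cgroup v1 hosts:
docker run --rm --memory=3600m --memory-swap=3600m \
  docker-image-same-as-we-run-in-pipelines \
  cat /sys/fs/cgroup/memory/memory.limit_in_bytes

# cgroup v2 hosts expose it here instead:
docker run --rm --memory=3600m --memory-swap=3600m \
  docker-image-same-as-we-run-in-pipelines \
  cat /sys/fs/cgroup/memory.max
```

If the printed value is about 3774873600 (3600 MiB in bytes), the limit is in effect and a build that exceeds it will be OOM-killed, mimicking the Pipelines failure.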

Upvotes: 3

Views: 3905

Answers (1)

Vibhanshu Biswas

Reputation: 379

Buddy, this is a pain in the butt. I have been in your shoes before. The problem is that even if you are within the limits, some percentage is occupied by the pipeline itself. I have an Angular project that builds well under 4 GB of memory locally, but running it on the pipeline exceeds the memory. So I had to use the size: 2x parameter on the step to allocate double the memory to the pipeline. The only drawback is that build minutes are counted double for billing: if a step runs for 4 minutes, it is billed as 8 minutes from your quota.
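For reference, this is roughly what the step looks like with size: 2x (a sketch; the image and build command are placeholders for whatever your project uses):

```yaml
# bitbucket-pipelines.yml — sketch of a step with doubled memory (8 GB).
image: node:16

pipelines:
  default:
    - step:
        name: Build
        size: 2x          # doubles memory; build minutes are billed at 2x
        script:
          - npm run build
```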

If there is another solution with plain Docker, I don't know it, but to run your build you can do the above. There is also one more approach: Atlassian recently introduced the Runners concept, where you can attach an on-premise runner to build your pipelines. This way you leverage your own hardware and are exempt from being billed by Atlassian for those minutes. I haven't tried it myself, though. https://support.atlassian.com/bitbucket-cloud/docs/runners/
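Routing a step to a self-hosted runner is done with the runs-on keyword (a sketch based on the docs linked above; the labels and build command are examples):

```yaml
# bitbucket-pipelines.yml — sketch: send this step to a self-hosted runner.
pipelines:
  default:
    - step:
        name: Build on our own hardware
        runs-on:
          - self.hosted   # required label for self-hosted runners
          - linux         # matches a label set when registering the runner
        script:
          - npm run build
```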

Upvotes: 1
