Reputation: 4332
Some time ago, our self-hosted GitLab instance started throwing errors saying the artifact archives were too large:
ERROR: Uploading artifacts as "archive" to coordinator... too large archive id=something responseStatus=413 Request Entity Too Large status=413 token=something FATAL: too large
ERROR: Job failed: exit code 1
The only fix we found suggested was to raise the maximum build artifact size (under /admin/application_settings). This did not work for us; the error still occurred.
Upvotes: 17
Views: 47001
Reputation: 2567
In my case the archive was too large because of the artifacts:untracked flag. Setting it to false solved the problem.
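For reference, that looks like this in .gitlab-ci.yml (the job name, build step, and paths here are just examples):

```yaml
build:
  script:
    - make              # hypothetical build step
  artifacts:
    untracked: false    # do not bundle untracked files into the archive
    paths:
      - dist/           # upload only what downstream jobs need (example path)
```

With untracked: true, every file not tracked by Git (build intermediates, downloaded dependencies, etc.) gets added to the archive, which can easily push it past the size limit.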
Upvotes: 0
Reputation: 19
Problem: Uploading artifacts as "archive" to coordinator... 413 Request Entity Too Large
Solution: Try clearing the GitLab runner's cache.
Upvotes: 1
Reputation: 2037
From the GitLab official docs:

The maximum size of job artifacts can be set at the instance, group, and project levels. The value is in MB and the default is 100 MB per job.

To change it at the instance level:

1. On the top bar, select Menu > Admin.
2. On the left sidebar, select Settings > CI/CD.
3. Change the value of maximum artifacts size (in MB).
4. Select Save changes for the changes to take effect.

To change it at the group level (this overrides the instance setting):

1. Go to the group’s Settings > CI/CD > General Pipelines.
2. Change the value of maximum artifacts size (in MB).
3. Select Save changes for the changes to take effect.

To change it at the project level (this overrides the instance and group settings):

1. Go to the project’s Settings > CI/CD > General Pipelines.
2. Change the value of maximum artifacts size (in MB).
3. Select Save changes for the changes to take effect.
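The instance-level setting can also be changed through the application settings REST API instead of the UI. A sketch, assuming an admin token and an example instance URL:

```shell
# Set the instance-wide maximum artifacts size to 200 MB (example value)
curl --request PUT \
  --header "PRIVATE-TOKEN: <admin_token>" \
  "https://gitlab.example.com/api/v4/application/settings?max_artifacts_size=200"
```

This is handy when managing several instances, since the change can be scripted rather than clicked through the admin area.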
Upvotes: 31
Reputation: 4332
The solution to this issue is twofold: raise the maximum build artifact size (under /admin/application_settings) and also increase NGINX's client_max_body_size in the GitLab configuration file. Changing only the artifact size setting is not enough, because NGINX will still reject the upload with 413 before it reaches GitLab.
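On an Omnibus install, that would look something like this (the 250m value is just an example; pick a limit that fits your artifacts):

```ruby
# /etc/gitlab/gitlab.rb
nginx['client_max_body_size'] = '250m'
```

Then run sudo gitlab-ctl reconfigure to apply the change. On a source install, set client_max_body_size directly in the NGINX server block for GitLab instead.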
Upvotes: 15