Reputation: 935
I'm trying to build an image in one job and push it to AWS ECR in another. Since the steps are different, I'm trying to pass the image as an artifact:
.gitlab-ci.yml:
stages:
  - build
  - push

build_image:
  stage: build
  image: docker
  services:
    - docker:19.03.12-dind
  script:
    # building docker image....
    - mkdir image
    - docker save apis_server > image/apis_server.tar
  artifacts:
    paths:
      - image

push_image:
  stage: push
  image: docker
  services:
    - docker:19.03.12-dind
  before_script:
    - apk add --no-cache python3 py3-pip && pip3 install --upgrade pip && pip3 install --no-cache-dir awscli
  script:
    - ls
    - docker load -i image/apis_server.tar
    - docker images
    # ecr auth and push to repo...
I get the following warning in the pipeline:
Uploading artifacts for successful job
Uploading artifacts...
WARNING: image: no matching files. Ensure that the artifact path is relative to the working directory
The second job fails with the following message:
$ docker load -i image/apis_server.tar
open image/apis_server.tar: no such file or directory
This approach is based on the answer provided here.
Upvotes: 0
Views: 2715
Reputation: 349
To fix the artifacts warning, use the full directory path for the artifacts entry, so the runner can match the directory your job actually created.
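As a sketch of that fix (assuming the job writes into the default checkout directory, which GitLab exposes as the built-in `$CI_PROJECT_DIR` variable), the build job could spell out the full path in both the script and the artifacts section:

```yaml
build_image:
  stage: build
  image: docker
  services:
    - docker:19.03.12-dind
  script:
    - mkdir -p "$CI_PROJECT_DIR/image"
    - docker save apis_server > "$CI_PROJECT_DIR/image/apis_server.tar"
  artifacts:
    paths:
      # full path under the project directory, so the path matches what was created
      - $CI_PROJECT_DIR/image/
```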
I also have some recommendations to speed up your pipeline. If you always install the same packages in your pipeline, build a Docker image from those requirements and use that image in your pipeline instead.
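For example, a small image with the AWS CLI preinstalled (a sketch; the image name and registry path you push it to are up to you) would let the push job drop its `before_script` entirely:

```dockerfile
# Dockerfile: docker base image plus awscli, so CI jobs skip the apk/pip install step
FROM docker:19.03.12
RUN apk add --no-cache python3 py3-pip \
    && pip3 install --upgrade pip \
    && pip3 install --no-cache-dir awscli
```

Build and push this once to a registry you control, then set it as the `image:` of the `push_image` job.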
If you need to deploy an image somewhere else, I recommend using Docker Hub or a self-hosted Docker registry. It is more efficient: in a registry-based deployment, only the changed layers are downloaded, whereas with the approach you are using, you transfer all the layers every time.
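For instance, building and pushing straight to a registry in a single job (a sketch; `$CI_REGISTRY`, `$CI_REGISTRY_USER`, `$CI_REGISTRY_PASSWORD`, and `$CI_REGISTRY_IMAGE` are GitLab's built-in variables for its own container registry, and the tag is illustrative) avoids the `docker save` / artifact / `docker load` round trip entirely:

```yaml
build_and_push:
  stage: build
  image: docker
  services:
    - docker:19.03.12-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE/apis_server:$CI_COMMIT_SHORT_SHA" .
    # only layers that changed since the last push are uploaded
    - docker push "$CI_REGISTRY_IMAGE/apis_server:$CI_COMMIT_SHORT_SHA"
```

For ECR the same pattern applies with `aws ecr get-login-password` feeding `docker login` and the repository URI as the tag.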
Upvotes: 1