Reputation: 71
For over a week I've been fighting with passing variables between jobs in a multi-project pipeline in GitLab CI, and I keep getting weird errors. The mechanism looks very basic, and it drives me crazy that such an obvious thing still doesn't work for me. If somebody has run into similar issues, I would appreciate your help!
So here is what I've been trying to build: I have two projects on GitLab and I'm trying to link them into a single multi-project pipeline. The job schema looks like this. In project A:
variables:
  BUILD_PATH: ""

build:
  script:
    - BUILD_PATH="some-path"  # the important point here: this value is set inside the job, it's not static

bridge:
  variables:
    PATH: $BUILD_PATH
    RUN_TYPE: test  # this value is static and it passes correctly, no issues here
  trigger:
    project: project-B-path
In project B:
variables:
  PATH: ""
  RUN_TYPE: ""

test:
  script:
    - echo "From upstream pipeline dynamic: $PATH"
    - echo "From upstream pipeline static: $RUN_TYPE"
...
When I run it on CI, the $RUN_TYPE variable is passed correctly, but $PATH arrives empty (even though $BUILD_PATH has the correct value during the run of the build job). I've tried many approaches: setting the $BUILD_PATH value in before_script, passing a predefined environment variable (like CI_JOB_ID) to the job in project B, not declaring the variable in project B at all, etc. Nothing helped; the dynamic variable always arrives empty.
Then I tried to save the dynamic variable $BUILD_PATH in a .env file and publish it as an artifact, so the downstream job could read it from there. I did it like this:
build:
  script:
    - some code here
    - echo "BUILD_VERSION=hello" >> vars.env
  artifacts:
    reports:
      dotenv: vars.env
When I run it on CI, the job always fails with errors like:
Uploading artifacts...
vars.env: found 1 matching files and directories
WARNING: Failed to load system CertPool: crypto/x509: system root pool is not available on Windows
WARNING: Uploading artifacts as "dotenv" to coordinator... failed id=1877748 responseStatus=500 Internal Server Error status=500 token=some-token-here
WARNING: Retrying... context=artifacts-uploader error=invalid argument
WARNING: Uploading artifacts as "dotenv" to coordinator... failed id=1877748 responseStatus=500 Internal Server Error status=500 token=some-token-here
WARNING: Retrying... context=artifacts-uploader error=invalid argument
WARNING: Uploading artifacts as "dotenv" to coordinator... failed id=1877748 responseStatus=500 Internal Server Error status=500 token=some-token-here
FATAL: invalid argument
I've also tried writing the .env file without a name, as I saw suggested somewhere:

  - echo "BUILD_VERSION=hello" >> .env

but again no luck, the same 500 error. I keep researching this error, but so far it's still with me.
So, the point: none of the ways of passing variables to the downstream pipeline in a multi-project pipeline worked for me. If anyone has hit the same issues or made it work a different way, please help.
UPDATE: I resolved this issue a different way, with a curl trigger from project A, like:
- curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=branchName --form "variables[PATH]=$BUILD_PATH" "https://gitlab/api/v4/projects/projectID/trigger/pipeline"
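For anyone taking the same route, here is a minimal shell sketch of that trigger call. GITLAB_HOST and PROJECT_ID are made-up placeholders (the original command elides the real host and project ID), and CI_JOB_TOKEN is the token GitLab predefines inside a real job. The script assembles and prints the command instead of executing it, so the quoting of the variables[...] form field is easy to inspect:

```shell
#!/bin/sh
# Placeholder values for illustration only; substitute your own.
GITLAB_HOST="https://gitlab.example.com"
PROJECT_ID="1234"
REF="branchName"
BUILD_PATH="some-path"

# Assemble the pipeline-trigger call. The backslash keeps $CI_JOB_TOKEN
# literal here, since it is only defined inside a running CI job.
CMD="curl --request POST \
 --form token=\$CI_JOB_TOKEN \
 --form ref=$REF \
 --form variables[PATH]=$BUILD_PATH \
 $GITLAB_HOST/api/v4/projects/$PROJECT_ID/trigger/pipeline"

# Dry run: print the command rather than hitting the API.
echo "$CMD"
```

Because the value of $BUILD_PATH is expanded by the shell at the moment the command runs, this works for dynamic values computed inside the job, unlike trigger-job variables, which are expanded when the pipeline is created.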
Upvotes: 7
Views: 15986
Reputation: 1
Have you tried forwarding the pipeline variables? It solved my issue.
deploy:
  stage: deploy
  trigger:
    project: my/downstream_project
    forward:
      pipeline_variables: true
Upvotes: 0
Reputation: 194
To pass variable values from an upstream pipeline to a downstream pipeline, we can use forward together with trigger:

# MY_VARIABLE is passed in via the CI/CD pipeline as user input
# variables defined in the upstream pipeline
variables:
  GLOBAL_VARIABLE: "value"

upstream-job:
  needs:
    - someOtherJobIfNeeded
  trigger:
    include:
      - local: "downstream_gitlab-ci.yml"
    forward:
      pipeline_variables: true
    strategy: depend
  only:
    - master

downstream_Job:
  needs:
    - someOtherJobIfNeeded
  before_script:
    - |
      if [[ -z "$MY_VARIABLE" ]]; then
        echo "Error: MY_VARIABLE is not set as a CI/CD variable."
        exit 1
      else
        echo "$MY_VARIABLE"
      fi
      if [[ -z "$GLOBAL_VARIABLE" ]]; then
        echo "Error: GLOBAL_VARIABLE is not set in the upstream pipeline."
        exit 1
      else
        echo "$GLOBAL_VARIABLE"
      fi
  script:
    - someOtherScriptIfNeeded.sh
  only:
    - master

Ref: https://docs.gitlab.com/ee/ci/yaml/#triggerforward
Upvotes: 0
Reputation: 133
I was able to do this by using a build.env dotenv artifact.
upstream project:
stages:
  - build
  - deploy

build_job:
  stage: build
  script:
    - echo var="defined in the job" >> $CI_PROJECT_DIR/build.env
  artifacts:
    reports:
      dotenv: build.env

trigger:
  stage: deploy
  trigger:
    project: path/to/downstream
downstream project:
downstream:
  script:
    - echo $var
  needs:
    - project: path/to/upstream
      job: build_job
As a bonus, you could also pass along the upstream project's path and branch by adding this block to the trigger job:

  variables:
    UPSTREAM_REF_NAME: $CI_COMMIT_REF_NAME
    UPSTREAM_PROJECT_PATH: $CI_PROJECT_PATH
and then change the downstream project's needs block to:

  needs:
    - project: $UPSTREAM_PROJECT_PATH
      ref: $UPSTREAM_REF_NAME
      job: build_job
Upvotes: 7
Reputation: 40861
Your downstream project's job needs to declare needs: on the upstream project's job.
upstream project:
build_vars:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  trigger: my/downstream_project
downstream project:
test:
  stage: test
  script:
    - echo $BUILD_VERSION
  needs:
    - project: my/upstream_project
      job: build_vars
      ref: main
      artifacts: true
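As a sanity check on the dotenv format itself (a common cause of upload and parsing failures), here is a small shell sketch. It writes a build.env the same way build_vars does above, and then simulates how the downstream job sees the values by sourcing the file locally; the variable names are taken from the examples in this thread. The report expects plain KEY=VALUE lines: no "export" prefix and no spaces around "=".

```shell
#!/bin/sh
# Write a dotenv report file the way build_vars does above.
# Format: one KEY=VALUE per line, no "export", no spaces around "=".
echo "BUILD_VERSION=hello" > build.env
echo "BUILD_PATH=some/path" >> build.env

# Simulate the downstream job: GitLab injects each line as an environment
# variable; locally, sourcing the file with allexport is close enough.
set -a
. ./build.env
set +a

echo "$BUILD_VERSION"
```

Running this prints hello, which is what the downstream job's echo $BUILD_VERSION would show once needs: with artifacts: true is in place.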
Upvotes: 3