Reputation: 22837
I'm trying to run a shell script from my template file, which lives in another project and is pulled in via include. How should this be configured to work? The scripts below are simplified versions of my code.
template.yml
deploy:
  before_script:
    - chmod +x ./.run.sh
    - source ./.run.sh
gitlab-ci.yml
include:
  - project: 'project-a'
    ref: master
    file: '/template.yml'

stages:
  - deploy
Clearly, the commands are actually being run from ProjectB and not ProjectA, where the template resides. This can further be confirmed by adding ls -a in the template file. So how should we be calling run.sh? Both projects are on the same GitLab instance, under different groups.
Upvotes: 24
Views: 52358
Reputation: 149
Refer to this: https://docs.gitlab.com/ee/api/repository_files.html#get-file-from-repository
GET /projects/:id/repository/files/:file_path/raw
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/13083/repository/files/app%2Fmodels%2Fkey%2Erb?ref=master"
This will display the file. To download it, just append >> <filename>, as below:
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/13083/repository/files/app%2Fmodels%2Fkey%2Erb?ref=master" >> file.extension
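Inside a CI job, a minimal sketch of the same idea (assumptions: a PRIVATE_TOKEN CI/CD variable you define with read access to the other project, the example project ID 13083 from above, and a top-level run.sh; $CI_API_V4_URL is a predefined variable pointing at the instance's API):

deploy:
  script:
    # fetch the raw file from the other project's repository
    - curl --header "PRIVATE-TOKEN: $PRIVATE_TOKEN" "$CI_API_V4_URL/projects/13083/repository/files/run%2Esh/raw?ref=master" -o run.sh
    - chmod +x run.sh
    - ./run.sh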
Upvotes: -1
Reputation: 743
It's also possible to fetch the scripts with curl instead of copying the whole repository:
- curl -H "PRIVATE-TOKEN:$PRIVATE_TOKEN" --create-dirs "$CI_API_V4_URL/projects/$CI_DEPLOY_PROJECT_ID/repository/archive?path=pathToFolderWithScripts" -o $TEMP_DIR/archive.tar.gz
- tar zxvf $TEMP_DIR/archive.tar.gz -C $TEMP_DIR --strip-components 3
- bash $TEMP_DIR/run.sh
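Pieced together as a full job, it might look like the sketch below. Assumptions: $PRIVATE_TOKEN and $CI_DEPLOY_PROJECT_ID are CI/CD variables you define (a token with read access, and the ID of the project holding the scripts), and --strip-components must match the folder depth inside the downloaded archive:

deploy:
  variables:
    TEMP_DIR: "$CI_PROJECT_DIR/tmp"
  script:
    # download an archive of just the scripts folder; --create-dirs makes $TEMP_DIR
    - curl -H "PRIVATE-TOKEN:$PRIVATE_TOKEN" --create-dirs "$CI_API_V4_URL/projects/$CI_DEPLOY_PROJECT_ID/repository/archive?path=pathToFolderWithScripts" -o $TEMP_DIR/archive.tar.gz
    # strip the archive's leading directories so run.sh lands directly in $TEMP_DIR
    - tar zxvf $TEMP_DIR/archive.tar.gz -C $TEMP_DIR --strip-components 3
    - bash $TEMP_DIR/run.sh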
Upvotes: 4
Reputation: 131
We've also run into this problem, and kinda wish Gitlab allowed include
s to "import" non-yaml files. Nevertheless the simplest workaround we've found is to build a small docker image in repo A, which contains the script you want to run, and then repo B's job uses that docker image as the image
, so the file run.sh
is available :)
Minimal Dockerfile:
FROM bash:latest
COPY run.sh /usr/local/bin/
CMD run.sh
(Note: make sure you chmod +x run.sh before building your image, or add a RUN chmod +x /usr/local/bin/run.sh step.)
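For completeness, a minimal sketch of a job in repo A that builds and pushes that image (assuming your runners support Docker-in-Docker; the CI_REGISTRY_* variables are predefined by GitLab for the built-in container registry):

build_image:
  image: docker:latest
  services:
    - docker:dind
  script:
    # log in to the project's container registry and publish the image
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"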
Then you'd just add this to Project B's .gitlab-ci.yml:
stages:
  - deploy

deploy:
  image: registry.gitlab.com/... # wherever you pushed your Docker image to
  script: run.sh
Upvotes: 13
Reputation: 21
As hinted by the answer above, multi-project pipelines are the right approach for this. Here's how it worked for me:
GroupX/ProjectA - contains reusable code
# .gitlab-ci.yml
stages:
  - deploy

reusable_deploy_job:
  stage: deploy
  rules:
    - if: '$CI_PIPELINE_SOURCE == "pipeline"' # run only if triggered by a pipeline
  script:
    - bash ./src/run.sh $UPSTREAM_CUSTOM_VARIABLE
GroupY/ProjectB - job that reuses that code
# .gitlab-ci.yml
stages:
  - deploy

deploy_job:
  stage: deploy
  variables:
    UPSTREAM_CUSTOM_VARIABLE: CUSTOM_VARIABLE # pass this variable to downstream job
  trigger: groupx/projecta
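For illustration, a hypothetical src/run.sh in ProjectA; the forwarded variable simply arrives as the first positional argument:

#!/usr/bin/env bash
# $1 holds the value passed in as $UPSTREAM_CUSTOM_VARIABLE by the upstream job
set -eu
echo "Deploying with custom variable: $1"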
Upvotes: -1
Reputation: 202
If you have access to both project A and project B, you can use multi-project pipelines. You trigger a pipeline in project A from project B; in project A, you clone project B and run your script.
# in project B: triggers a downstream pipeline in project A
job 1:
  variables:
    PROJECT_PATH: "$CI_PROJECT_PATH"
    RELEASE_BRANCH: "$CI_COMMIT_BRANCH"
  trigger:
    project: project-a
    strategy: depend
# in project A: runs only when triggered, clones project B and runs the script
job 2:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "pipeline" && $PROJECT_PATH && $RELEASE_BRANCH'
  script:
    - git clone -b "${RELEASE_BRANCH}" --depth 50 https://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}/${PROJECT_PATH}.git $(basename ${PROJECT_PATH})
    - cd $(basename ${PROJECT_PATH})
    - chmod +x ../.run.sh
    - source ../.run.sh
Upvotes: 17