bitsnbytes

Reputation: 1

GitLab CI Regex not running job

I'm having an issue where GitLab CI will not run my job no matter what I try. I'm using a regex to match against a variable defined within my pipeline (it's read from a file by a Python script). The echo shows the variable is set, but the regex never matches.

I am looking for 'dev' anywhere within the variable; if it's present, the build job should run. A snippet of the .gitlab-ci.yml file is below:

before_script:
  - "docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY"
  - "echo $CI_COMMIT_REF_SLUG"
  - export NEW_TAG=$(python3 read_current_version.py)
  - "echo $NEW_TAG"

build:
  stage: "build"
  tags: 
    - "shell"
  script:
    - "cd ./app"
    - "echo BUILDING $CI_REGISTRY_IMAGE_FLASK:$NEW_TAG"
    - "docker build -t $CI_REGISTRY_IMAGE/$CI_REGISTRY_IMAGE_FLASK:$NEW_TAG -f ./Dockerfile ."
    - "echo PUSHING $CI_REGISTRY_IMAGE_FLASK:$NEW_TAG"
    - "docker push $CI_REGISTRY_IMAGE/$CI_REGISTRY_IMAGE_FLASK:$NEW_TAG"
  rules:
    - if: $NEW_TAG =~ /dev/

Any help would be much appreciated!

I expected the job to run when a new commit is pushed; instead it is skipped, which means the rule's condition isn't matching.

Upvotes: 0

Views: 336

Answers (1)

bhito

Reputation: 2673

Your build job is not being executed because, when GitLab evaluates the condition in your rules, the variable is empty. Even though you're using a global before_script, it runs as part of the build job itself, right before the script.

That code never gets the chance to run here: GitLab decides whether to add the job to the pipeline based on your rules condition, and at pipeline-creation time $NEW_TAG is empty, so the job is skipped.

To overcome that, you can create a job that runs before build, generates the variable, and passes it to the build job via an artifact. The GitLab docs have an example of how to do this with a dotenv artifact.
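Something along these lines should work (a minimal, untested sketch: the prepare job name and the build.env filename are placeholders, and it assumes read_current_version.py prints the tag to stdout). One caveat: dotenv variables are available in a later job's script but not in its rules, because rules are evaluated before any job runs, so the dev check has to move into the script:

prepare:
  stage: "build"
  tags:
    - "shell"
  script:
    # write the computed tag to a dotenv file so later jobs receive it
    - echo "NEW_TAG=$(python3 read_current_version.py)" > build.env
  artifacts:
    reports:
      dotenv: build.env

build:
  stage: "build"
  tags:
    - "shell"
  needs:
    - prepare
  script:
    # rules can't see $NEW_TAG, so gate on it here instead
    - if [[ "$NEW_TAG" != *dev* ]]; then echo "Not a dev version, skipping"; exit 0; fi
    - cd ./app
    - docker build -t "$CI_REGISTRY_IMAGE/$CI_REGISTRY_IMAGE_FLASK:$NEW_TAG" -f ./Dockerfile .
    - docker push "$CI_REGISTRY_IMAGE/$CI_REGISTRY_IMAGE_FLASK:$NEW_TAG"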

Another option would be to use downstream pipelines, but I think that would be too much effort for your use case.
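For completeness, the downstream route would mean generating a child pipeline definition with the tag already baked in, roughly like this (a sketch only; child-template.yml and both job names are hypothetical):

generate-child:
  stage: "build"
  tags:
    - "shell"
  script:
    # substitute the computed tag into a pipeline template
    - NEW_TAG=$(python3 read_current_version.py)
    - sed "s/__NEW_TAG__/$NEW_TAG/g" child-template.yml > child.yml
  artifacts:
    paths:
      - child.yml

run-child:
  stage: "build"
  needs:
    - generate-child
  trigger:
    include:
      - artifact: child.yml
        job: generate-child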

Upvotes: 0
