Rod Ramírez

Reputation: 1368

Azure .yaml pipeline file strategy in branching

I have a git branching strategy as follows:

branches:

experimental-feature integrates to develop

develop integrates to master

In each branch I have a file named azure-pipelines.yml that holds the rules for a pipeline build. The file differs from branch to branch, because each one has a trigger property that matches the branch name.

I.e.: the master branch has an azure-pipelines.yml whose trigger property is set to "master", so every change on the master branch fires a pipeline. Same for develop and experimental-feature. So far so good.
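A minimal sketch of what each branch's azure-pipelines.yml might look like under this setup (the pool and build step are placeholders, not taken from the question):

# azure-pipelines.yml on the master branch; develop and experimental-feature
# carry the same file, each with its own branch name in the trigger.
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo "Building master"   # placeholder for the real build steps
  displayName: Build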

Now, I don't understand why, when I create a pull request from develop to master, git somehow doesn't recognize the changes between the two azure-pipelines.yml files (develop and master). Which is good, because otherwise the azure-pipelines.yml in develop would always overwrite the one in master, and I don't want that.

But when I integrate the changes from experimental-feature to develop via pull request, git does recognize the changes between the files, which I don't want.

Can someone enlighten me here? I can't find in the Microsoft documentation how this works either.

Upvotes: 1

Views: 2222

Answers (1)

Iqan Shaikh

Reputation: 277

You could use triggers and stage conditions to run specific stages for specific branches only.

I would suggest having a single pipeline file, azure-pipelines.yml, for all branches and all environments. You can then create templates for the jobs, e.g. a Build job, a Deploy-to-non-prod job, etc.

Like this:


Pipeline:

trigger:
- master
- dev

pr:
  branches:
    include:
      - master
      - dev

variables:
  - name: vmImage
    value: 'ubuntu-latest'

stages:
  - stage: Build
    displayName: Build stage
    jobs:
    - job: BuildJob
      pool:
        vmImage: $(vmImage)
      steps:
      - template: Jobs/build.yml
 
  - stage: NonProd
    displayName: Deploy non prod stage
    condition: and(succeeded(), in(variables['build.sourceBranch'], 'refs/heads/master', 'refs/heads/dev'))
    jobs:
    - deployment: DeploymentJob1
      pool:
        vmImage: $(vmImage)
      environment: non-prod
      variables:
        - template: Variables/non-prod.yml
      strategy:
        runOnce:
          deploy:
            steps:
            - template: Jobs/deploy.yml

  - stage: Prod
    displayName: Deploy prod stage
    condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
    jobs:
    - deployment: DeploymentJob2
      pool:
        vmImage: $(vmImage)
      environment: prod
      variables:
        - template: Variables/prod.yml
      strategy:
        runOnce:
          deploy:
            steps:
            - template: Jobs/deploy.yml
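
The Jobs/build.yml, Jobs/deploy.yml and Variables/*.yml templates are referenced above but not shown; as an illustrative sketch only (the actual steps and variable values depend on your project), they could look like this:

Jobs/build.yml (a steps template, consumed under steps:):

steps:
- script: echo "Restore, build and test"   # replace with your real build steps
  displayName: Build and test

Variables/non-prod.yml (a variable template, consumed under variables:):

variables:
- name: environmentName
  value: 'non-prod'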

Triggers

1. Ensures a direct check-in build is triggered for the dev and master branches:

trigger:
- master
- dev

2. Ensures a PR targeting the dev or master branch triggers a build:

pr:
  branches:
    include:
      - master
      - dev

Conditions

1. Ensures the non-prod deployment stage only runs for the master and dev branches and not for any other branch:

condition: and(succeeded(), in(variables['build.sourceBranch'], 'refs/heads/master', 'refs/heads/dev'))

2. Ensures the prod deployment stage only runs for the master branch:

condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))

Similarly, you can mix and match things to better suit your purpose.
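For example (an assumption based on the branching model in the question, not something shown in the answer), you could add the experimental-feature branch to the trigger list so it gets CI builds only; the deployment stages would be skipped for it automatically because of the conditions above:

trigger:
- master
- dev
- experimental-feature   # CI build only; the deploy stages' conditions skip this branch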

Please have a look at this repository for reference: https://github.com/iqans/azure-pipeline-demo

Upvotes: 5
