Behrang Saeedzadeh

Reputation: 47913

How to make sure the list of parameters is updated before running a Jenkins pipeline?

A Jenkins pipeline project is configured to fetch its Jenkinsfile from a Git repo:

[Screenshot: the job's Pipeline definition, fetching the Jenkinsfile from the Git repo]

If I change the list of parameters, for example, from:

properties([
        parameters([
                string(name: 'FOO', description: 'Choose foo')
        ])
])

to:

properties([
        parameters([
                string(name: 'FOO', description: 'Choose foo'),
                string(name: 'BAR', description: 'Choose bar')
        ])
])

and run the build, the first run does not show the newly added BAR parameter:

[Screenshot: parameter entry page still missing the new BAR parameter]

As the updated Jenkinsfile expects the BAR parameter to be present, the first build after the change fails because the user is not presented with an input for this value.

Is there a way to prevent this? To make sure the Jenkinsfile is up-to-date before showing the parameter entry page?

Upvotes: 32

Views: 22910

Answers (7)

Jon Daley

Reputation: 31

The way I do it, which I find more convenient (at least when first developing a pipeline), is to have the job triggered by commits to GitHub:

triggers {
    githubPush()
}

And then in my parameters stage, I have:

if (BUILD_USER_ID == "scmChange") {
    // Triggered by an SCM commit; BUILD_USER_ID comes from the Build User Vars plugin.
    cleanup = false
    abort(false, "Success - reloaded configuration")
}
else if (Models =~ /.*ERROR.*/) {
    // Needed for the automatic builds that run when scanning for branches.
    cleanup = false
    abort(true, "", "This is not an error if this was an automatic build from changing the GUI multi-branch config")
}

Where "Models" is a parameter that is known to always be in the job configuration no matter what edits might happen in the future.

Kind of hacky, but it works.

The "scmChange" (requires the build user vars plugin) part runs on normal github commits, and the second/else section is needed when scanning for branches.

Note: the abort() function is a library wrapper that I wrote to make the build title, description, etc. have a nicer color and report of why it aborted (the first parameter controls the color).
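
For reference, a minimal sketch of what such a shared-library abort() step could look like; the name, signature and result handling below are assumptions inferred from the usage above, not the author's actual implementation:

// vars/abort.groovy -- hypothetical sketch, not the author's real wrapper.
// abort(failed, title, description): set a nicer build title/description and
// stop the pipeline, reporting it as FAILURE (red) or ABORTED (grey).
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
import hudson.model.Result

def call(boolean failed, String title = '', String description = '') {
    if (title) {
        currentBuild.displayName = "#${env.BUILD_NUMBER} - ${title}"
    }
    if (description) {
        currentBuild.description = description
    }
    // The first parameter controls how the stopped build is coloured.
    throw new FlowInterruptedException(failed ? Result.FAILURE : Result.ABORTED)
}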

Upvotes: 0

baumato

Reputation: 378

The idea of the following approach is to define the default value once and not fail the pipeline if the parameter is not there yet, which is the case on the very first run.

  • Define the default value above the pipeline.
  • Use it in your parameter definition.
  • Override the parameter in the environment section and use the environment variable throughout the pipeline.

Example:

NEW_PARAM_DEFAULT = 'the default value'

pipeline {

  parameters {
    string(
        name: 'NEW_PARAM',
        description: 'Safely introduce a new parameter without a failure on the first run.',
        defaultValue: NEW_PARAM_DEFAULT)
  }
  
  environment {
     NEW_PARAM_VALUE = params.getOrDefault('NEW_PARAM', NEW_PARAM_DEFAULT)
     // safely use NEW_PARAM_VALUE everywhere
  }

  stages {
    ...
  }
}
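
For illustration, the elided stages block could use the mirrored variable like this (the stage name and echo step are just placeholders):

stages {
    stage('Use the parameter') {
        steps {
            // NEW_PARAM_VALUE is resolved in the environment block above and
            // falls back to NEW_PARAM_DEFAULT when the parameter does not exist yet.
            echo "NEW_PARAM resolves to: ${env.NEW_PARAM_VALUE}"
        }
    }
}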

Upvotes: 0

Gary Trickett

Reputation: 1

Try:

    parameters {
        string(name: 'GRADLE_ARGS', defaultValue: '--console=plain', description: 'Gradle arguments')
    }

    environment{
        GRADLE_ARGS = "${params.GRADLE_ARGS}"
    }
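
For context, these two blocks would sit inside a declarative pipeline, roughly like this (the agent and the Build stage are illustrative assumptions):

    pipeline {
        agent any

        parameters {
            string(name: 'GRADLE_ARGS', defaultValue: '--console=plain', description: 'Gradle arguments')
        }

        environment {
            // Mirror the parameter into an environment variable for use in steps.
            GRADLE_ARGS = "${params.GRADLE_ARGS}"
        }

        stages {
            stage('Build') {
                steps {
                    sh './gradlew build $GRADLE_ARGS'
                }
            }
        }
    }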

Upvotes: -1

julian-alarcon

Reputation: 324

An issue related to this was reported against Jenkins a few years ago: https://issues.jenkins-ci.org/browse/JENKINS-41929

It is still open, so there is no elegant solution yet.

Upvotes: 2

Mohl

Reputation: 415

The only solution to this problem, AFAIK, is to manually add a "skip_run" boolean parameter and then add a when{} clause to every stage of the job.

    properties([
        parameters([
                booleanParam(name: 'skip_run', defaultValue: false, description: 'Skips all stages. Used to update parameters in case of changes.')
        ])
    ])

...

stage('Doing Stuff') {
        when {
            expression { return params.skip_run ==~ /(?i)(N|NO|F|FALSE|OFF|STOP)/ }
        }
        steps {
            ...
        }
    }

This is, of course, very prone to error.

Alternatively, you could add a single stage at the very beginning of the pipeline and fail the build on purpose.

stage('Update Build Info only') {
        when {
            expression { return params.skip_run ==~ /(?i)(Y|YES|T|TRUE|ON|RUN)/ }
        }
        steps {
            error("This was done deliberately to update the build info.")
        }
    }

UPDATE: Thanks to Abort current build from pipeline in Jenkins, I came up with this solution:

To prevent the build from actually appearing red, you could wrap this in a try/catch (scripted syntax) and exit the build gracefully.

final updateOnly = 'updateOnly'

try {
    stage('Update Build Info only') {
        if (params.skip_run ==~ /(?i)(Y|YES|T|TRUE|ON|RUN)/) {
            error(updateOnly)
        }
    }
    // other stages here
} catch (e) {
    if (e.message == updateOnly) {
        currentBuild.result = 'ABORTED'
        echo('Skipping the Job to update the build info')
        // return here instead of rethrowing to keep the build "green"
        return
    }
    // normal error handling
    throw e
}

Upvotes: 4

dolphy

Reputation: 6498

Short answer: No. It would be nice if there were some facility for parsing and processing the Jenkinsfile separately from the build, but there's not.

Jenkins does not know about the new parameters until it retrieves, parses, and runs the Jenkinsfile, and the only way to do that is to...run a build.

In effect, the build history will always be "one run behind" the Jenkinsfile; when you change something in the Jenkinsfile, the next build will run with the "old" Jenkinsfile, but pick up and process the new Jenkinsfile for the build after that.

Upvotes: 24

ctran

Reputation: 31

I have a function that skips the build unless the job has all the required parameters, something like:

if (job.hasParameters(['FOO', 'BAR'])) {
    // pipeline code
}
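
The hasParameters helper is not shown in the answer; a minimal sketch of how such a check could be written in a scripted Jenkinsfile (the name and signature here are assumptions) might be:

// Hypothetical helper: returns true only if every required parameter
// is already defined on the job, i.e. present in the params map.
def hasAllParameters(List<String> required) {
    def missing = required.findAll { !params.containsKey(it) }
    if (missing) {
        echo "Missing parameters ${missing}; aborting so the next run picks up the new definition."
        currentBuild.result = 'ABORTED'
        return false
    }
    return true
}

if (hasAllParameters(['FOO', 'BAR'])) {
    // pipeline code
}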

Upvotes: 3
