Scott Miller

Reputation: 61

Jenkins pipelineJob DSL not interpolating variables in pipeline script

I'm trying to generate Jenkins pipelines using the pipelineJob function in the Job DSL plugin, but cannot pass parameters from the DSL to the pipeline script. I have several projects that use what is essentially the same Jenkinsfile, differing only in a few steps. I'm trying to use the Job DSL plugin to generate these pipelines on the fly, with the values I want changed interpolated from the parameters passed to the DSL.

I've tried just about every combination of string interpolation that I can in the pipeline script, as well as in the DSL, but cannot get Jenkins/Groovy to interpolate variables in the pipeline script.

I'm calling the job DSL in a pipeline step:

def projectName = "myProject"
def envs = ['DEV','QA','UAT']
def repositoryURL = 'myrepo.com'

jobDsl targets: ['jobs/*.groovy'].join('\n'), 
    additionalParameters: [
        project: projectName, 
        environments: envs, 
        repository: repositoryURL
    ],
    removedJobAction: 'DELETE',
    removedViewAction: 'DELETE'

The DSL is as follows:

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

pipeline.groovy:

pipeline {
    agent any

    environment {
        REPO = repository
    }

    parameters {
        choice(name: 'ENVIRONMENT', choices: environments)
    }

    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}

The variables that I pass in additionalParameters are interpolated in the Job DSL script; a pipeline with the correct name does get generated. The problem is that the variables are not passed to the pipeline script read from the workspace: the Jenkins configuration for the generated pipeline looks exactly the same as the file, with no interpolation of the variables.

I've made a number of attempts at getting the string to interpolate, including a lot of variations of "${environments}", ${environments}, $environments, \$environments...I can't find any that work. I've also tried reading the file as a GString:

script("${readFileFromWorkspace('pipeline.groovy')}")

Does anyone have any ideas as to how I can make variables propagate down to the pipeline script? I know that I could just use a for loop to do string.replaceAll() on the script text, but that seems cumbersome; there's got to be a better way.
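
By cumbersome I mean something like this sketch in the DSL script, swapping each token by hand before handing the text to script() (the token names are just whatever markers I would put in the file):

def fileContents = readFileFromWorkspace('pipeline.groovy')
// brute-force substitution: replace every occurrence of each token with its value
[repository: repository, environments: environments.join(',')].each { token, value ->
    fileContents = fileContents.replaceAll(token, value)
}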

Upvotes: 3

Views: 7128

Answers (3)

Mayank Kapoor

Reputation: 41

You need to use the complete job name as a variable, without quotes. E.g., if JOBNAME is a parameter containing the entire job name:

pipelineJob(JOBNAME) {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}
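
If the jobs are generated from a seed pipeline as in the question, JOBNAME could be supplied through additionalParameters, e.g. (a sketch; the toString() makes sure a plain String, not a GString, reaches the DSL binding):

jobDsl targets: 'jobs/*.groovy',
    additionalParameters: [JOBNAME: "${projectName} pipeline".toString()]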

Upvotes: 1

jpadams

Reputation: 191

You could achieve what you're trying to do by defining environment variables in the pipelineJob and then using those variables in your pipeline.

They are a bit limited because environment variables are strings, but it should work for basic cases.

Ex.:

//job-dsl
pipelineJob('example') {
    environmentVariables {
        // these vars could be specified by parameters of this job
        env('repository', 'blah')
        env('environments', 'a,b,c') // comma-separated string
    }
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

And then in the pipeline:

//pipeline.groovy
pipeline {
    agent any

    environment {
        REPO = env.repository
    }

    parameters {
        // note the need to split the comma-separated string
        choice(name: 'ENVIRONMENT', choices: env.environments.split(','))
    }

    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}
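
If the values arrive through additionalParameters as in the question, they just need to be flattened into strings on the way in; a sketch of the job-dsl side under that assumption:

//job-dsl, taking values from the seed job's additionalParameters
pipelineJob('example') {
    environmentVariables {
        env('repository', repository)               // already a string
        env('environments', environments.join(',')) // flatten the list to a comma-separated string
    }
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}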

Upvotes: 1

Scott Miller

Reputation: 61

I've come up with a way to make this work. It's not what I'd prefer, which is having the string contents of the file implicitly interpolated during job creation, but it does work; it just adds an extra step.

import groovy.text.SimpleTemplateEngine

def fileContents = readFileFromWorkspace('pipeline.groovy')

// expand the ${...} placeholders in the file from this DSL script's binding,
// which includes everything passed via additionalParameters
def engine = new SimpleTemplateEngine()
def template = engine.createTemplate(fileContents).make(binding.getVariables()).toString()

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps { 
            script(template)
        }
    }
}

This reads a file from your workspace, then runs it through the template engine with the binding variables. The other change needed to make this work is escaping any variables in your Jenkinsfile script that should be expanded at runtime, writing them as \${VARIABLE} so they survive job creation; any variables you want expanded at job-creation time stay as ${VARIABLE}.
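
For example, a templated version of the pipeline.groovy from the question might look like this sketch (assuming the binding holds repository as a String and environments as a List, as passed via additionalParameters; the escaped \${...} references pass through SimpleTemplateEngine untouched and are resolved at pipeline runtime):

//pipeline.groovy, written as a template
pipeline {
    agent any

    environment {
        REPO = '${repository}' // filled in at job creation
    }

    parameters {
        // inspect() renders the bound List as a Groovy list literal
        choice(name: 'ENVIRONMENT', choices: ${environments.inspect()})
    }

    stages {
        stage('Deploy') {
            steps {
                // escaped, so resolved when the pipeline runs
                echo "Deploying \${env.REPO} to \${params.ENVIRONMENT}..."
            }
        }
    }
}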

Upvotes: 3
