Reputation: 3
We have several pipeline jobs with the same structure and behavior: update an Ansible repository, execute a playbook with some parameters whose values depend on the environment, and test the execution with InSpec. We've tried to abstract the general behavior into an external file.
JenkinsfileAnsible:
#!/usr/bin/env groovy
import groovy.json.JsonOutput

node {
}

def executePlaybook(environment) {
    pipeline {
        agent any
        stages {
            stage('Update repository') {
                ...
            }
            stage('Execute playbook') {
                ...
            }
            stage('Execute tests') {
                ...
            }
        }
    }
}
return this
Each environment has a specific Jenkinsfile that sets the parameters and loads the general Jenkinsfile in order to execute the pipeline.
JenkinsfileDev:
#!/usr/bin/env groovy
import groovy.json.JsonOutput

node {
    checkout scm
    def ansible = load "../JenkinsfileAnsible"
    ansible.executePlaybook("development")
}
The code has been simplified, and we have no trouble loading the external file or executing the defined functions. The problem is that we wanted to define the pipeline inside the general file, since it is the same for every environment, and just invoke it, but we can't make it work.
We've run into errors because Jenkins doesn't recognize the pipeline definition in the external file.
Any advice? Is it not possible? Is there something we are missing?
Upvotes: 0
Views: 3029
Reputation: 1072
You can use Jenkins Pipeline Shared Libraries, documented at https://jenkins.io/doc/book/pipeline/shared-libraries/.
The approach would be to have a Jenkinsfile like this:
@Library('your-pipeline') _
thePipeline([param1: val1])
And in the Pipeline Library code, something like:
def call(Map<String, String> pipelineConfig) {
    pipeline {
        agent any
        stages {
            stage('Update repository') {
                // You can use your pipelineConfig param1 etc.
            }
            stage('Execute playbook') {
                ...
            }
            stage('Execute tests') {
                ...
            }
        }
    }
}
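To make the pipelineConfig comment concrete, a stage body could look roughly like this; the environment key, the site.yml playbook name and the shell command are only illustrative assumptions, not part of the answer:
stage('Execute playbook') {
    steps {
        // value comes from the map passed in by the environment-specific Jenkinsfile
        sh "ansible-playbook site.yml --extra-vars env=${pipelineConfig.environment}"
    }
}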
You can use the config params for different environments or even create different pipelines for the different environments.
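For example, each environment could then keep a minimal Jenkinsfile that only passes its own values (the environment key and its values are assumptions for illustration):
JenkinsfileDev:
@Library('your-pipeline') _
thePipeline([environment: 'development'])
JenkinsfileProd:
@Library('your-pipeline') _
thePipeline([environment: 'production'])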
Hope it helps.
Upvotes: 2