Reputation: 11540
I have a Jenkins Job DSL seed job that calls out to a couple of pipeline jobs, e.g.
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
pipelineJob("job2") {
definition {
cps {
script(readFileFromWorkspace('job2.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
job1.groovy and job2.groovy are standard Jenkinsfile style pipelines.
I want to pass a couple of common maps into these jobs. These contain things that may vary between environments, e.g. target servers and credential names.
Something like:
def SERVERS_MAP = [
    'prod': [
        'prod-server1',
        'prod-server2',
    ],
    'dev': [
        'dev-server1',
        'dev-server2',
    ],
]
Can I define a map in my seed job that I can then pass and access as a map in my pipeline jobs?
Upvotes: 4
Views: 9921
Reputation: 79
You can pass the map value as a string parameter and have the downstream job read it back as JSON. In the upstream (calling) job:
timestamps {
    node("sse_lab_CI_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage("-- regression execute --") {
            // Serialise the map as a JSON string so it can travel as a string parameter
            def test_map = """
                {
                    "gerrit_patchset_commit": "aad5fce",
                    "build_cpu_x86_ubuntu": [
                        "centos_compatible_build_test",
                        "gdb_compatible_build_test",
                        "visual_profiler_compatible_build_test"
                    ]
                }
            """
            build(job: 'tops_regression_down',
                  parameters: [
                      string(name: 'UPSTREAM_JOB_NAME', value: "${env.JOB_BASE_NAME}"),
                      string(name: 'UPSTREAM_BUILD_NUM', value: "${env.BUILD_NUMBER}"),
                      string(name: 'MAP_PARAM', value: "${test_map}"),
                  ],
                  propagate: true,
                  wait: true)
        }
    }
}
In the downstream job (tops_regression_down), read the MAP_PARAM string parameter back into a map with readJSON:
timestamps {
    node("sse_lab_inspur_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage('--in precondition--') {
            dir('./') {
                cleanWs()
                println("hello world")
                println("${env.MAP_PARAM}")
                // Parse the JSON string back into a Map
                Map result_json = readJSON(text: "${env.MAP_PARAM}")
                println(result_json)
            }
        }
    }
}
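Note that readJSON is provided by the Pipeline Utility Steps plugin, so that plugin needs to be installed for the downstream job to parse MAP_PARAM.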
Upvotes: 0
Reputation: 11540
I've come up with a hacky workaround using the pipeline-utility-steps plugin.
Essentially I pass my data maps around as JSON.
So my seed job might contain:
def SERVERS_MAP = '''
{
    "prod": [
        "prod-server1",
        "prod-server2"
    ],
    "dev": [
        "dev-server1",
        "dev-server2"
    ]
}
'''
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
stringParam('SERVERS_MAP', "${SERVERS_MAP}", "")
}
}
}
and my pipeline would contain something like:
def serversMap = readJSON text: SERVERS_MAP
def targetServers = serversMap["${ENV}"]

targetServers.each { server ->
    echo server
}
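With ENV set to dev, for example, this just echoes dev-server1 and dev-server2.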
I could also extract these variables into a JSON file and read them from there.
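For example (a rough sketch; the servers.json file name and its location next to the seed script are assumptions), the seed job could load the JSON from a file instead of embedding the string:
// Assumed: a servers.json file checked in alongside the seed job's DSL scripts
def SERVERS_MAP = readFileFromWorkspace('servers.json')

pipelineJob("job1") {
    definition {
        cps {
            script(readFileFromWorkspace('job1.groovy'))
        }
        parameters {
            choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
            stringParam('SERVERS_MAP', "${SERVERS_MAP}", "")
        }
    }
}
The pipeline side stays the same, since it still receives the JSON text through the SERVERS_MAP string parameter.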
Although it works, it feels wrong somehow.
Upvotes: 5