Reputation: 71
I'm using Jenkins to automate parallel JMeter tests. This is set up as two separate Jenkins pipeline jobs, the parent job and the child job.
The child job takes a series of parameters and executes the JMeter test against the target service. This works, archiving four CSVs and an XML file on each build.
The parent job executes the child job multiple times in parallel on different nodes. Currently it executes it twice in testing, but is intended to eventually spawn 10 or 20 child jobs at a time. The parallel execution works, and the child job records two builds each time the parent is executed, with their artifacts archived.
The problem is how to configure the Copy Artifacts plugin to retrieve the artifacts from the child jobs so they can be archived on the parent job.
The child job defines a parameter named ParentBuildTag, of type "Build selector for Copy Artifact". The "Permission to Copy Artifact" checkbox is checked, with the "Projects to allow copy artifacts" field set to *.
Here is the post block from the parent job where the copy is attempted:

post {
    always {
        script {
            print "buildParameter('${BUILD_TAG}') == " + buildParameter("${BUILD_TAG}")
            copyArtifacts optional: false, projectName: 'CC_DGN_Test', selector: buildParameter("${BUILD_TAG}")
            archiveArtifacts "*.xml"
        }
        cleanWs()
    }
}
The build parameter is being populated to the child job like so:
stage('Node 2') {
    agent { node { label 'PIPELINE' } }
    steps {
        script {
            node2 = build job: 'CC_DGN_Test',
                parameters: [
                    string(name: 'dummy', value: "2"),
                    string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
                    string(name: 'Labels', value: "JMETER"),
                    ...additional parameters snipped...
                ]
        }
    }
}
The console log shows an error:
Error when executing always post condition:
hudson.AbortException: Unable to find a build for artifact copy from: CC_DGN_Test
at hudson.plugins.copyartifact.CopyArtifact.perform(CopyArtifact.java:412)
at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:80)
at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:67)
at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution.lambda$start$0(SynchronousNonBlockingStepExecution.java:47)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Nothing is copied to the parent. The build tag is correctly printed to the console log (from the print statement in post{}).
08:18:52 buildParameter('jenkins-CC_DGN_TrickleTest-45') == @buildParameter(<anonymous>=jenkins-CC_DGN_TrickleTest-45)
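In hindsight, the buildParameter selector seems to expect the name of a "Build selector for Copy Artifact" parameter defined on the running job, not the raw build-tag value, so passing "${BUILD_TAG}" probably never resolves to a build. A hedged sketch of that usage, assuming the parent job itself declared such a parameter named ParentBuildTag (mine does not):

```groovy
// Hypothetical: requires the *parent* job to define a
// "Build selector for Copy Artifact" parameter named ParentBuildTag.
copyArtifacts projectName: 'CC_DGN_Test',
              selector: buildParameter('ParentBuildTag')
```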
This next approach looks promising, but I think there's a syntax issue: I believe I should be telling copyArtifacts to use the ParentBuildTag parameter, whose value is 'jenkins-CC_DGN_TrickleTest-45', but I haven't found an example describing the syntax.
stage('Node 2') {
    agent { node { label 'PIPELINE' } }
    steps {
        script {
            node2 = build job: 'CC_DGN_Test',
                parameters: [
                    string(name: 'dummy', value: "2"),
                    string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
                    string(name: 'Labels', value: "JMETER"),
                    ...additional parameters snipped...
                ]
            print "Build number (node 2) = " + node2.number // prints build number to console, e.g. "Build number (node 2) = 102"
            copyArtifacts optional: false, filter: '*.xml, *.csv', fingerprintArtifacts: true, projectName: 'CC_DGN_Test', selector: specific(node2.number)
        }
    }
}
The build numbers are correctly printed to the console log and no errors are logged, but nothing is copied.
I also tried this approach, borrowed from an example:

properties([parameters([
    [$class: 'BuildSelectorParameter',
     defaultSelector: upstream(fallbackToLastSuccessful: true),
     description: '',
     name: 'ParentBuildTag']])
])

copyArtifacts(
    projectName: 'CC_DGN_Test',
    selector: [
        class: 'ParameterizedBuildSelector',
        parameterName: 'ParentBuildTag'
    ]
);
Again, my suspicion is that I need to tell it what value to use for ParentBuildTag, but the syntax example I borrowed this from didn't show how to do that. The upstream(...) part was just copied from the example; I don't think I need it, but it seemed harmless to include while testing.
I also experimented with stash and unstash:

stash includes: '*.xml', name: 'node1xml'
unstash 'node1xml'
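As I understand it, stash and unstash only share files between stages of a single run of a single pipeline, so they can't bridge the child builds and the parent build. A minimal sketch of the case they do cover (the 'OTHER' label is hypothetical):

```groovy
// stash/unstash move files between stages (and nodes) of the SAME build.
stage('Produce') {
    steps {
        writeFile file: 'result.xml', text: '<ok/>'   // create a file on this node
        stash includes: '*.xml', name: 'node1xml'
    }
}
stage('Consume') {
    agent { node { label 'OTHER' } }                  // hypothetical label
    steps {
        unstash 'node1xml'                            // same build, different node
    }
}
```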
Here is the current parent job configuration, snipped in places for brevity:
pipeline {
    agent { node { label 'PIPELINE' } }
    options {
        timeout(time: 1, unit: 'HOURS')
        buildDiscarder(logRotator(numToKeepStr: '100'))
        timestamps()
    }
    environment {
        node1 = ""
        node2 = ""
    }
    stages {
        stage('Clean Up') {
            steps {
                cleanWs()
            }
        }
        stage('Test') {
            parallel {
                stage('Node 1') {
                    agent { node { label 'PIPELINE' } }
                    steps {
                        script {
                            node1 = build job: 'CC_DGN_Test',
                                parameters: [
                                    string(name: 'dummy', value: "1"),
                                    string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
                                    string(name: 'Labels', value: "JMETER"),
                                    ...additional parameters snipped...
                                ]
                        }
                    }
                }
                stage('Node 2') {
                    agent { node { label 'PIPELINE' } }
                    steps {
                        script {
                            node2 = build job: 'CC_DGN_Test',
                                parameters: [
                                    string(name: 'dummy', value: "2"),
                                    string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
                                    string(name: 'Labels', value: "JMETER"),
                                    ...additional parameters snipped...
                                ]
                        }
                    }
                }
            }
        }
    }
    post {
        always {
            script {
                copyArtifacts optional: false, projectName: 'CC_DGN_Test', selector: buildParameter("${BUILD_TAG}")
                archiveArtifacts "*.xml"
            }
            cleanWs()
        }
    }
}
My goal is for the parent job to contain a total of eight CSVs and two XML files after the job completes, based on the current configuration, but nothing is archived on the parent job currently. Where am I going wrong with the copyArtifacts syntax?
Upvotes: 5
Views: 3103
Reputation: 2850
Your point 2 approach is the correct one. You just need to convert node2.number to a String:

selector: specific("${node2.number}")
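Applied to the in-stage copy from the question, the call would then read (same job name and variables as in the question's snippet):

```groovy
// Build number converted to a String before handing it to specific()
copyArtifacts optional: false,
              filter: '*.xml, *.csv',
              fingerprintArtifacts: true,
              projectName: 'CC_DGN_Test',
              selector: specific("${node2.number}")
```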
You could also call the child jobs in a method. Here is an example script:
#! groovy
pipeline {
    agent any // declarative pipelines require an agent directive
    environment {
        childJobName = "Testing/MyChildJob"
    }
    stages {
        stage('Child Jobs') {
            parallel {
                stage('ChildJob1') {
                    steps {
                        runJob(childJobName, '@tag1 @tag2', 'job1')
                    }
                }
                stage('ChildJob2') {
                    steps {
                        runJob(childJobName, '@tag3 @tag4', 'job2')
                    }
                }
            }
        }
    }
    post {
        cleanup {
            cleanWs()
        }
    }
}

def runJob(String jobName, String tags, String rootReportDir) {
    // propagate: false keeps the parent running even if the child fails
    def childJob = build job: jobName, propagate: false, wait: true, parameters: [string(name: 'TAGS', value: tags)]
    copyArtifacts filter: "report.html", projectName: jobName, selector: specific("${childJob.number}"), target: rootReportDir
    archiveArtifacts artifacts: "${rootReportDir}/report.html"
    if (childJob.result == "FAILURE") {
        bat "exit 1" // Windows agent; use sh "exit 1" on Linux
    }
}
In this example the child jobs are all the same Jenkins job. The parent job passes different parameters to each of them.
The report file produced by the child job is copied into the rootReportDir in the parent job. The rootReportDir should be unique for each child job so that each report has a unique path when archived to the parent job.
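Adapted to the question's setup, the helper might look like this (a sketch: runChild, the parameter values, and the per-child target directories are assumptions based on the question's snippets, not tested code):

```groovy
// Hypothetical helper mirroring the question's child-job call.
def runChild(String dummy, String reportDir) {
    def child = build job: 'CC_DGN_Test', propagate: false, wait: true,
        parameters: [
            string(name: 'dummy', value: dummy),
            string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
            string(name: 'Labels', value: 'JMETER')
        ]
    // Copy this specific child build's results into a per-child directory,
    // then archive them on the parent.
    copyArtifacts filter: '*.xml, *.csv', projectName: 'CC_DGN_Test',
                  selector: specific("${child.number}"), target: reportDir
    archiveArtifacts artifacts: "${reportDir}/**"
}
```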
Upvotes: 2