khmarbaise

Reputation: 97359

Get parameter of another job in Pipeline

I have two jobs. The first one is triggered manually with some parameters (for example a number which is needed later). The second job uses a run parameter to select a particular build of the first job.

Now I need to get the parameters from the first job (in this case the number), which the second job needs as well. I want to avoid making the user enter the number parameter a second time.

In the meantime I have found a way to extract the parameters from the first job via:

@NonCPS
def getParameters(def item) {
    // Runs outside the CPS engine because ParametersAction and the
    // parameter objects are not serializable.
    def result = ""
    def p = item?.actions.find { it -> it instanceof ParametersAction }?.parameters
    p.each { it ->
        echo "parameter ${it.name}: ${it.value}"
        if (it.name.equals("NUMBER")) {
            result = it.value.toString()
        }
    }
    return result
}
...

node (..) {

    def item = hudson.model.Hudson.instance.getItem("${SELECTED_JOBNAME}")
    def number = Integer.parseInt("${SELECTED_NUMBER}")
    def x = item.getBuildByNumber(number)

    def newNumber = getParameters(x)
    ...
}

The problem is that I get the following:

parameter NUMBER: 16
[Pipeline] echo
org.jenkinsci.plugins.workflow.job.WorkflowJob@1776388d[XX-YY]
[Pipeline] echo
XX-YY #48
[Pipeline] echo
newNumber: 16
[Pipeline] stage
[Pipeline] { (First)
[Pipeline] }
[Pipeline] }
[Pipeline] End of Pipeline
java.io.NotSerializableException: org.jenkinsci.plugins.workflow.job.WorkflowJob
    at org.jboss.marshalling.river.RiverMarshaller.doWriteObject(RiverMarshaller.java:860)
    at org.jboss.marshalling.river.BlockMarshaller.doWriteObject(BlockMarshaller.java:65)
    at org.jboss.marshalling.river.BlockMarshaller.writeObject(BlockMarshaller.java:56)
    at org.jboss.marshalling.MarshallerObjectOutputStream.writeObjectOverride(MarshallerObjectOutputStream.java:50)
    at org.jboss.marshalling.river.RiverObjectOutputStream.writeObjectOverride(RiverObjectOutputStream.java:179)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:343)
    at java.util.HashMap.writeObject(HashMap.java:1129)

I have already tried to use @NonCPS at several locations:

node (..) {

    @NonCPS
    def item = hudson.model.Hudson.instance.getItem("${SELECTED_JOBNAME}")
    @NonCPS
    def number = Integer.parseInt("${SELECTED_NUMBER}")
    @NonCPS
    def x = item.getBuildByNumber(number)
    @NonCPS
    def newNumber = getParameters(x)

but with no luck. Does someone have an idea how to solve this issue? Maybe there is another way to get the parameters?

Upvotes: 8

Views: 5142

Answers (3)

pez

Reputation: 21

Your solution looks OK. The exception you are getting can be avoided if you wipe out the value of the variable item before you leave the node {} section.

The reason for this is that Jenkins may need to copy the variables defined in individual node sections/stages to another worker. For that reason, every variable you leave defined needs to be serializable, which WorkflowJob wasn't in older versions of Jenkins; it works in Jenkins 2.319.2.
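
For illustration, a minimal sketch of the snippet from the question with the non-serializable references cleared before the node block ends (the variable names are taken from the question):

node (..) {
    def item = hudson.model.Hudson.instance.getItem("${SELECTED_JOBNAME}")
    def number = Integer.parseInt("${SELECTED_NUMBER}")
    def x = item.getBuildByNumber(number)

    def newNumber = getParameters(x)
    echo "newNumber: ${newNumber}"

    // WorkflowJob and Run objects are not serializable; clear the
    // references so the CPS engine does not try to persist them.
    item = null
    x = null
}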

By the way, the following pipeline fetches the parameters from the upstream job successfully:

def getUpstreamParameters() {
    def params = [:]
    try {
        def cause = currentBuild.rawBuild.getCause(hudson.model.Cause$UpstreamCause)
        def job_name = cause?.upstreamProject
        def job_id = cause?.upstreamBuild
        def upstream_job = Jenkins.getInstance().getItemByFullName(job_name).getBuildByNumber(job_id)
        def param_list = upstream_job.actions.find{ a -> a instanceof ParametersAction }?.parameters

        param_list.each { p ->
            echo "upstream parameter found: ${p.name} = ${p.value}"
            params[p.name] = p.value
        }
    } catch(NullPointerException ex) {
        echo "WARNING: this script is expected to be triggered by upstream_job."
    }

    return params
}

pipeline {
    agent any

    stages {
        stage('Print Upstream Parameters') {
            steps {
                script {
                    def params = getUpstreamParameters()
                    echo params.get('FMRI', 'default_fmri_here')
                }
            }
        }
    }
}

Upvotes: 2

mahes wari

Reputation: 91

Try to call the second job in a post-build action of the first job's configuration.
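
If the first job is itself a pipeline, a minimal sketch of the same idea using the build step in a post section (the downstream job name second-job is a placeholder; the NUMBER parameter is taken from the question):

pipeline {
    agent any
    parameters {
        string(name: 'NUMBER', description: 'Number needed by both jobs')
    }
    stages {
        stage('Build') {
            steps {
                echo "Running with NUMBER=${params.NUMBER}"
            }
        }
    }
    post {
        success {
            // Hand the parameter over so the user does not enter it twice
            build job: 'second-job', wait: false,
                parameters: [string(name: 'NUMBER', value: params.NUMBER)]
        }
    }
}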

Upvotes: 0

Kamil Roman

Reputation: 1071

It's kind of a workaround, but in the source build you can produce an additional artifact containing the needed parameter values, e.g. a Java properties file. Then you can copy this artifact into your pipeline and extract the values.
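
A minimal sketch of that idea, assuming the Copy Artifact and Pipeline Utility Steps plugins are installed (the file name build-params.properties is made up):

// In the first job: write the parameter to a properties file and archive it
writeFile file: 'build-params.properties', text: "NUMBER=${params.NUMBER}"
archiveArtifacts artifacts: 'build-params.properties'

// In the second job: copy the file from the selected build and read it back
copyArtifacts projectName: "${SELECTED_JOBNAME}", selector: specific("${SELECTED_NUMBER}")
def props = readProperties file: 'build-params.properties'
echo "NUMBER from the first job: ${props['NUMBER']}"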

Upvotes: 0
