rainer

Reputation: 7109

Run Parts of a Pipeline as Separate Job

We're considering using the Jenkins Pipeline plugin for a rather complex project consisting of several deliveries that need to be build using different tools (on different machines) before being merged. Still, it seems to be easy enough to do a complete build with a single Jenkinsfile, and I like the automatic discovery of git branches that comes with Pipeline.

However, at this point, we have jobs for each of the deliveries and use a build-flow based "meta" job to orchestrate the individual jobs. The nice thing about this is that it also allows starting just one individual job if only small changes were made, just to see whether this delivery still compiles.

To emulate this, some ideas came to mind:

Are those viable options, or is there a better one?

Upvotes: 8

Views: 11670

Answers (2)

Pavel S.

Reputation: 1346

As an expansion of the previous answer, I would propose something like this:

def stageIf(String name, Closure body) {
    // The numeric prefixes ('1.', '2.', ...) make plain string
    // comparison order the stage names correctly.
    if (params.firstStage <= name && params.lastStage >= name) {
        stage(name, body)
    } else {
        stage(name) {
            echo "Stage skipped: $name"
        }
    }
}

node('linux') {
    properties([
            parameters([
                    choiceParam(
                            name: 'firstStage',
                            choices: '1.Build\n' +
                                    '2.Docker\n' +
                                    '3.Deploy',
                            description: 'First stage to start',
                            defaultValue: '1.Build'
                    ),
                    choiceParam(
                            name: 'lastStage',
                            choices: '3.Deploy\n' +
                                    '2.Docker\n' +
                                    '1.Build',
                            description: 'Last stage to start',
                            defaultValue: '3.Deploy'
                    )
            ])
    ])

    stageIf('1.Build') {
        // ...
    }
    stageIf('3.Deploy') {
        // ...
    }
}

Not as perfect as I'd wish, but at least it's working.
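On newer Jenkins versions the same stage-range trick can be expressed in declarative pipeline syntax with `when { expression { ... } }` guards instead of a helper function. A minimal sketch, assuming the same parameter names and illustrative stage names:

```groovy
// Declarative sketch: skip stages outside the [firstStage, lastStage] range.
// Stage bodies are placeholders; the numeric prefixes make plain string
// comparison order the stage names.
pipeline {
    agent { label 'linux' }
    parameters {
        choice(name: 'firstStage', choices: ['1.Build', '2.Docker', '3.Deploy'],
               description: 'First stage to run')
        choice(name: 'lastStage', choices: ['3.Deploy', '2.Docker', '1.Build'],
               description: 'Last stage to run')
    }
    stages {
        stage('1.Build') {
            when { expression { params.firstStage <= '1.Build' && params.lastStage >= '1.Build' } }
            steps { echo 'building...' }
        }
        stage('3.Deploy') {
            when { expression { params.firstStage <= '3.Deploy' && params.lastStage >= '3.Deploy' } }
            steps { echo 'deploying...' }
        }
    }
}
```

With `when`, a skipped stage still shows up (greyed out) in the stage view, so the visualization stays consistent across runs.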

Upvotes: 1

olenz

Reputation: 577

What you could do is write a pipeline script that has "if"-guards around the individual stages, like this:

stage "s1"
if (theStage in ["s1","all"]) {
    sleep 2
}

stage "s2"
if (theStage in ["s2", "all"]) {
    sleep 2
}

stage "s3"
if (theStage in ["s3", "all"]) {
    sleep 2
}

Then you can make a "main" job that uses this script and runs all stages at once by setting the parameter "theStage" to "all". This job will collect statistics when all stages run together and give you useful time estimates.

Furthermore, you can make a "partial run" job that uses the same script and is parameterized with the stage you want to run. The time estimates will not be very useful, though.

Note that I put the stage itself into the main script and only the execution code into the conditional, as suggested by Martin Ba. This ensures that the visualization of the job remains more reliable.
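The `theStage` variable in the snippet above would typically come from a build parameter, so the "main" and "partial run" jobs can share one script. A minimal sketch (the parameter name matches the snippet; the choice values are illustrative):

```groovy
// Declare the guard variable as a choice parameter so the "main" job
// can pass 'all' and a "partial run" job can pass a single stage name.
properties([
    parameters([
        choice(name: 'theStage',
               choices: ['all', 's1', 's2', 's3'],
               description: 'Single stage to run, or "all" for a full build')
    ])
])

def theStage = params.theStage  // consumed by the if-guards around each stage
```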

Upvotes: 6
