Wesley Bland

Reputation: 9062

Jenkins Pipeline / Groovy Script with undefined variables

I'm trying to convert my large multi-config Jenkins job over to pipeline syntax so I can, among other things, split it across multiple nodes and combine my multiple stages into one job. Here's the part where I'm seeing trouble:

def build_test_configs = [:]
def compilers = ['gnu', 'icc']
def configs = ['debug', 'default', 'opt']

for (int i = 0; i < configs.size(); i++) {
    for (int j = 0; j < compilers.size(); j++) {
        def node_name = ""
        if ("${compilers[j]}" == "gnu") {
            node_name = "node001"
        } else {
            node_name = "node002"
        }
        build_test_configs["${node_name} ${configs[i]}"] = {
            node ("${node_name}") {
                stage("Build Test ${node_name} ${compilers[j]} ${configs[i]}") {
                    unstash "${node_name}-tarball"
                    sh "$HOME/software/jenkins_scripts/nightly.sh ${configs[i]} ${compilers[j]} yes $WORKSPACE"
                }
            }
        }
    }
}

parallel build_test_configs

My problem is that ${compilers[j]} and ${configs[i]} are undefined by the time the closures I'm building up in the build_test_configs map on line 13 actually run. It would appear that the check on line 8 is working just fine.

Update

I don't have an error message per se; the script doesn't produce any runtime errors. The unexpected output is that the stage names come out with null in place of the compiler and config values, and the nightly.sh script is getting passed null parameters as well.

Upvotes: 3

Views: 8036

Answers (1)

Hugues M.

Reputation: 20467

I think this is the expected behavior: Jenkins Pipeline scripts are written in Groovy but what is actually executed is a transformation of that (the term they use is "continuation-passing style transformation"). For example, some parts will run on the master, some on the slave nodes.

This involves a lot of magic that flies way above my head, but at our level it means we have to work with constraints in the syntax & constructs we use.
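One common workaround for the closure-capture part of this (a minimal sketch of the usual Groovy idiom, not tested against your exact setup) is to copy the loop values into per-iteration local variables before the closure is created, so each closure captures its own copies instead of the shared counters i and j:

def build_test_configs = [:]
def compilers = ['gnu', 'icc']
def configs = ['debug', 'default', 'opt']

for (int i = 0; i < configs.size(); i++) {
    for (int j = 0; j < compilers.size(); j++) {
        // Copy the current values into per-iteration locals; the closure
        // below captures these instead of the loop counters themselves.
        def compiler = compilers[j]
        def config = configs[i]
        def node_name = (compiler == 'gnu') ? 'node001' : 'node002'
        build_test_configs["${node_name} ${config}"] = {
            node(node_name) {
                stage("Build Test ${node_name} ${compiler} ${config}") {
                    unstash "${node_name}-tarball"
                    // \$HOME and \$WORKSPACE are escaped so the shell expands them on the node.
                    sh "\$HOME/software/jenkins_scripts/nightly.sh ${config} ${compiler} yes \$WORKSPACE"
                }
            }
        }
    }
}

parallel build_test_configs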

See the "fundamentals" paragraph of this article:

To understand Pipeline behavior you must understand a few points about how it executes.

  1. Except for the steps themselves, all of the Pipeline logic, the Groovy conditionals, loops, etc execute on the master. Whether simple or complex! Even inside a node block!
  2. Steps may use executors to do work where appropriate, but each step has a small on-master overhead too.
  3. Pipeline code is written as Groovy but the execution model is radically transformed at compile-time to Continuation Passing Style (CPS).
  4. This transformation provides valuable safety and durability guarantees for Pipelines, but it comes with trade-offs: Steps can invoke Java and execute fast and efficiently, but Groovy is much slower to run than normal. Groovy logic requires far more memory, because an object-based syntax/block tree is kept in memory.
  5. Pipelines persist the program and its state frequently to be able to survive failure of the master.
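As a rough illustration of the first point above (a hypothetical fragment, not taken from the article): in the following scripted-pipeline snippet the loop itself is CPS-transformed Groovy that executes on the master, and only the sh step is dispatched to the node's executor.

node('node001') {
    // The Groovy for loop runs on the master as CPS-transformed code;
    // each sh step is what actually executes on node001's executor.
    for (int i = 0; i < 3; i++) {
        sh "echo iteration ${i}"
    }
}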

Also see JENKINS-41335 discussing support of variables across the script.

Edit: ah, yes, as pointed out in the comments, the new declarative model allows you to define an environment with variables that are passed the way you need... I don't know how to do that in scripted pipeline without JENKINS-41335, but it seems further evolutions will now happen in declarative land :/
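For what it's worth, a minimal declarative sketch of that idea (the label, variable names, and values here are placeholders, not the asker's real ones): the environment directive makes variables visible to every stage and exports them to shell steps.

pipeline {
    agent { label 'node001' }
    environment {
        // Values defined here are available in all stages and exported to sh steps.
        COMPILER = 'gnu'
        CONFIG   = 'debug'
    }
    stages {
        stage('Build Test') {
            steps {
                sh 'echo building with $COMPILER in $CONFIG mode'
            }
        }
    }
}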

Upvotes: 2
