Reputation: 43
I have a simple pipeline. I was able to take the properties block outside the pipeline block and run it successfully. When I try to move the properties block to an external Groovy file and import it with "load", or even to a shared library, the pipeline fails. Is there a way to share a block of code outside the pipeline block?
Here is my attempt with a shared library, which failed:
@Library("shared-library") _
properties()
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
Here is my attempt with loading an external Groovy file, which also failed:
def shared_funcs = load "${env.WORKSPACE}/shared/@script/shared_funcs.groovy"
shared_funcs.properties()
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
This fails with: "Required context class hudson.FilePath is missing. Perhaps you forgot to surround the code with a step that provides this, such as: node"
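From the message, the load step itself seems to need a workspace, which only exists inside a node block. A rough sketch of what the error is asking for (assuming the repo is checked out inside the node first, and that shared_funcs.groovy ends with return this so load returns an object whose call() can be invoked):

def shared_funcs
node {
    // load needs a workspace, so run it inside node
    checkout scm
    shared_funcs = load "${env.WORKSPACE}/shared/@script/shared_funcs.groovy"
}
shared_funcs()
// pipeline { ... } as above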
shared_funcs.groovy
def call() {
    properties([
        parameters([
            string(name: 'AWS_ACCESS_KEY_ID', defaultValue: ''),
            string(name: 'AWS_SECRET_ACCESS_KEY', defaultValue: '')
        ]),
        pipelineTriggers([])
    ])
}
Upvotes: 2
Views: 2403
Reputation: 36
Make sure your Global Pipeline Library has the right structure. https://www.jenkins.io/doc/book/pipeline/shared-libraries/#directory-structure
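For reference, the layout described there looks like this, with the global variable files under vars/:

(root)
+- src/                      # Groovy source files (optional)
+- vars/
|   +- shared_funcs.groovy   # global variable, defines call()
+- resources/                # non-Groovy helper files (optional)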
Make sure you have a file in the vars/ directory of that library. For your purpose it should be named shared_funcs.groovy and define a call() function:
def call() {
    properties([
        parameters([
            string(name: 'AWS_ACCESS_KEY_ID', defaultValue: ''),
            string(name: 'AWS_SECRET_ACCESS_KEY', defaultValue: '')
        ]),
        pipelineTriggers([])
    ])
}
You need to have it structured with the call() function for it to work; that is what lets the Jenkinsfile invoke it by the file name, as in the sketch below.
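On the Jenkinsfile side that would look roughly like this. It is a sketch assuming the library is registered globally under the name "shared-library"; note that the step is invoked as shared_funcs(), matching the file name under vars/, rather than as properties():

@Library('shared-library') _

// resolves to vars/shared_funcs.groovy and runs its call()
shared_funcs()

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        // ...remaining stages as in the question
    }
}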
Upvotes: 2