Reputation: 702
I'm new to using Jenkins....
I'm trying to automate the production of an image (to be stashed in a repo) using a declarative Jenkinsfile. I find the documentation to be confusing (at best). Simply put, how can I convert the following scripted example (from the docs)
node {
    checkout scm
    def customImage = docker.build("my-image:${env.BUILD_ID}")
    customImage.push()
}
to a declarative Jenkinsfile....
Upvotes: 30
Views: 51356
Reputation: 4364
I cannot recommend the declarative syntax for building a Docker image, because it seems that every important step requires falling back to the old scripted syntax. However, if you must, a hybrid approach seems to work.
First, a detail about the scm step: when I defined the Jenkins "Pipeline script from SCM" project that fetches my Jenkinsfile with a declarative pipeline from git, Jenkins cloned the repo as the first step in the pipeline even though I did not define an scm step myself.
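For reference, that implicit checkout can be suppressed with the skipDefaultCheckout option and then performed explicitly wherever you want it (a minimal sketch, not from the original docs):
pipeline {
    agent any
    options {
        // suppress the implicit "checkout scm" that declarative
        // pipelines run when the agent starts
        skipDefaultCheckout()
    }
    stages {
        stage('Checkout') {
            steps {
                // check out explicitly at the point you choose
                checkout scm
            }
        }
    }
}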
For the build and push steps, I can only find solutions that are a hybrid of old-style scripted pipeline steps inside the new-style declarative syntax. For example, see gustavoapolinario's work at Medium:
https://medium.com/@gustavo.guss/jenkins-building-docker-image-and-sending-to-registry-64b84ea45ee9
which has this hybrid pipeline definition:
pipeline {
    environment {
        registry = "gustavoapolinario/docker-test"
        registryCredential = 'dockerhub'
        dockerImage = ''
    }
    agent any
    stages {
        stage('Cloning Git') {
            steps {
                git 'https://github.com/gustavoapolinario/microservices-node-example-todo-frontend.git'
            }
        }
        stage('Building image') {
            steps {
                script {
                    dockerImage = docker.build registry + ":$BUILD_NUMBER"
                }
            }
        }
        stage('Deploy Image') {
            steps {
                script {
                    docker.withRegistry( '', registryCredential ) {
                        dockerImage.push()
                    }
                }
            }
        }
        stage('Remove Unused docker image') {
            steps {
                sh "docker rmi $registry:$BUILD_NUMBER"
            }
        }
    }
}
Because the first step here is a clone, I think he built this example as a standalone pipeline project in Jenkins (not a Pipeline script from SCM project).
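If you do run it as a "Pipeline script from SCM" project instead, the explicit git step can simply be replaced with a checkout of the repository the Jenkinsfile came from (a small sketch adapting the first stage above):
stage('Cloning Git') {
    steps {
        // check out the same repository that supplied the Jenkinsfile
        checkout scm
    }
}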
Upvotes: 1
Reputation: 119
If the build process requires executing scripts inside the container, the following approach can be used:
pipeline {
    environment {
        registry = 'fill_me'
        registryCredentials = 'AWS'
        awsRegion = 'us-east-1'
        dockerImage = ''
    }
    options {
        disableConcurrentBuilds()
    }
    agent none
    stages {
        stage('Build, run, push & upload') {
            agent any
            steps {
                script {
                    dockerImage = docker.build registry + ":${BUILD_TAG}"
                    // run a command inside the freshly built image
                    // (TEST_VAR is assumed to be defined elsewhere)
                    dockerImage.inside {
                        sh """
                        echo ${TEST_VAR} >> file.txt
                        """
                    }
                    docker.withRegistry("https://" + registry, "ecr:" + awsRegion + ":" + registryCredentials) {
                        dockerImage.push()
                    }
                }
            }
        }
    }
}
The Docker Pipeline plugin is required.
Upvotes: 0
Reputation: 4100
I'm using the following approach:
steps {
    withDockerRegistry([ credentialsId: "<CREDENTIALS_ID>", url: "<PRIVATE_REGISTRY_URL>" ]) {
        // commands in this block run while logged in to the registry
        sh 'docker push <image>'
    }
}
Where <CREDENTIALS_ID> is the ID of the Docker registry credentials stored in Jenkins and <PRIVATE_REGISTRY_URL> is the URL of your private registry.
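For context, here is how that step can sit in a complete declarative pipeline (a sketch; the image name, credentials ID, and registry host are hypothetical):
pipeline {
    agent any
    stages {
        stage('Build and push') {
            steps {
                // build with the plain Docker CLI (hypothetical image name)
                sh 'docker build -t registry.example.com/my-image:latest .'
                withDockerRegistry([ credentialsId: 'my-registry-creds', url: 'https://registry.example.com' ]) {
                    // push while the registry login from the wrapper is active
                    sh 'docker push registry.example.com/my-image:latest'
                }
            }
        }
    }
}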
Upvotes: 8
Reputation: 3314
You can use scripted pipeline blocks in a declarative pipeline as a workaround:
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                echo 'Starting to build docker image'
                script {
                    def customImage = docker.build("my-image:${env.BUILD_ID}")
                    customImage.push()
                }
            }
        }
    }
}
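Note that customImage.push() as written assumes the agent is already logged in to the target registry. To authenticate inside the pipeline, the push can be wrapped in docker.withRegistry (a sketch; 'docker-hub-creds' is a hypothetical Jenkins credentials ID):
script {
    def customImage = docker.build("my-image:${env.BUILD_ID}")
    // log in for the duration of this block, then push
    docker.withRegistry('https://registry.hub.docker.com', 'docker-hub-creds') {
        customImage.push()
    }
}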
Upvotes: 38