Steve Stokes

Reputation: 1230

Bitbucket Pipelines - steps - docker - can't find image

I'm building my pipeline to create a Docker image and then push it to AWS. I have it broken into steps, and in Bitbucket you have to tell it which artifacts to share between them. I have a feeling this is a simple bug, but I just cannot figure it out.

It's failing at 'docker tag' in step 4 with:

docker tag $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
Error response from daemon: No such image: projectname:v.11

Basically, it cannot find the Docker image that was created in the previous step...

Here's my pipeline script (some of it simplified)

image: atlassian/default-image:latest

options:
  docker: true

pipelines:
  branches:
    dev:          
      - step:
         name: 1. Install dotnet
         script:
           # Do things

      - step:
         name: 2. Install AWS CLI
         script:
           # Do some more things

      - step:
         name: 3. Build Docker Image
         script:
           - export DOCKER_PROJECT_NAME=projectname

           - docker build -t $DOCKER_PROJECT_NAME:latest -t $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER .
         artifacts:
           - ./**

      - step:
         name: 4. Push Docker Image to AWS
         script:
           # Tag and push my docker image to ECR
           - export DOCKER_PROJECT_NAME=projectname
           - docker tag $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
           - docker push $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER

Now, I know these commands work, but only if I collapse everything into a single step. For whatever reason, step 4 doesn't have access to the Docker image created in step 3. Any help is appreciated!

Upvotes: 1

Views: 3045

Answers (2)

phod

Reputation: 536

The Docker image is not being passed from step 3 to step 4 because the image is not stored in the build directory, so it is never captured by your artifacts.

The simplest solution would be to combine all four of your steps into a single step as follows:

image: atlassian/default-image:latest

options:
  docker: true

pipelines:
  branches:
    dev:          
      - step:
         script:
           # Install dependencies
           - ./install-dot-net
           - ./install-aws-cli

           # Build the Docker image
           - export DOCKER_PROJECT_NAME=projectname
           - docker build -t $DOCKER_PROJECT_NAME:latest -t $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER .

           # Tag and push the Docker image to ECR
           - export DOCKER_PROJECT_NAME=projectname
           - docker tag $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
           - docker push $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER

Upvotes: 0

bert bruynooghe

Reputation: 3093

Your Docker images are not stored in the folder where you start the build, so they are not saved to the artefacts and are not available in the next step.

Even if they were (you could pack and unpack them with docker save and docker load), you would probably run up against the size limits for artefacts, not to mention the time it takes to pack/unpack.
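For reference, a minimal sketch of that save/load approach (the image.tar filename and the trimmed step list are my own assumptions, not part of the original pipeline):

      - step:
         name: 3. Build Docker Image
         script:
           - export DOCKER_PROJECT_NAME=projectname
           - docker build -t $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER .
           # Write the image into the clone directory so the artifacts glob can pick it up
           - docker save --output image.tar $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
         artifacts:
           - image.tar

      - step:
         name: 4. Push Docker Image to AWS
         script:
           - export DOCKER_PROJECT_NAME=projectname
           # Restore the image from the artifact before tagging and pushing
           - docker load --input image.tar
           - docker tag $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
           - docker push $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER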

I guess you'd be better off creating a Dockerfile in your project yourself and combining steps 1 & 2 there. Your Bitbucket pipeline could then be based on a Docker image that already contains the AWS CLI and uses Docker as a service, and your single step would then consist of building your project's Dockerfile and uploading the image to AWS. This also lowers your dependency on Bitbucket Pipelines. A rough sketch of that single-step setup is below (the amazon/aws-cli base image is just one example of an image with the AWS CLI preinstalled, and ECR authentication is omitted here just as it was in the original script):
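image: amazon/aws-cli

pipelines:
  branches:
    dev:
      - step:
         name: Build and push Docker image
         services:
           - docker
         script:
           - export DOCKER_PROJECT_NAME=projectname
           # The project's own Dockerfile takes care of installing dotnet,
           # so there are no separate install steps here
           - docker build -t $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER .
           # Log in to ECR here if needed, then tag and push
           - docker tag $DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER
           - docker push $AWS_REGISTRY_URL/$DOCKER_PROJECT_NAME:v.$BITBUCKET_BUILD_NUMBER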

Upvotes: 3
