Jez

Reputation: 30081

How to make a Docker container available to a task in Azure Pipelines?

I want to run my integration tests against a local Docker-hosted SQL Server 2019 database inside an Azure pipeline. I'm trying to copy sqlpackage into this Docker container so I can restore a .bacpac each time in order to set up the database before the tests. Here's the YAML config I have so far:

azure-pipelines.yml:

resources:
  containers:
  - container: 'sqlserver_linux_container_for_integration_tests'
    image: 'mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-16.04'
    ports:
    - 1433:1433/tcp
    env:
      ACCEPT_EULA: 'Y'
      SA_PASSWORD: 'P@ssW0rd!'

trigger:
- azure-pipelines

pool:
  vmImage: 'ubuntu-16.04'

variables:
  buildConfiguration: 'Release'

steps:
- task: Bash@3
  displayName: 'Echo some Docker info from bash'
  inputs:
    targetType: 'inline'
    script: |
      echo 'docker info'
      docker info
      echo 'docker ps'
      docker ps

- task: Docker@2
  displayName: 'Copy sqlpackage to Docker SQL Server'
  inputs:
    command: 'cp'
    arguments: '$(Agent.BuildDirectory)/s/DBProject/_TestDb_/sqlpackage-linux/sqlpackage sqlserver_linux_container_for_integration_tests:/tmp/sqlpackage'

I want to access the Docker container sqlserver_linux_container_for_integration_tests from my Docker@2 step so I can copy the sqlpackage binary into it (the database .bacpac will be copied later, and sqlpackage will be run to restore the DB from it). However, I get the error "Error: No such container:path: sqlserver_linux_container_for_integration_tests:/tmp", so it looks like either the Docker container isn't being started or it isn't being made available to my task. In addition, docker ps in the first task shows no output.

So how can I make the Docker container available for my tasks to copy files to and execute commands on?

Upvotes: 2

Views: 1300

Answers (1)

Jez

Reputation: 30081

Managed to solve the problem.

First, I needed to declare the container as a service of the job so that it actually gets started (docker ps now lists it), by adding the following before the variables: section:

services:
  sqlserver_linux_container_for_integration_tests: sqlserver_linux_container_for_integration_tests
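
For context, the placement ends up looking roughly like this in the single-job pipeline from the question (pool and variables exactly as already declared); the key on the left is the service alias and the value on the right refers back to the container resource name under resources.containers:

pool:
  vmImage: 'ubuntu-16.04'

services:
  sqlserver_linux_container_for_integration_tests: sqlserver_linux_container_for_integration_tests

variables:
  buildConfiguration: 'Release'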

... but it still got a name assigned with some random characters tacked onto the end, preventing me from directly addressing it with the docker command. However, I noticed that an environment variable was created containing JSON that maps the container name to its ID:

AGENT_CONTAINERMAPPING = {
  "sqlserver_linux_container_for_integration_tests": {
    "id": "623aa2b3a1505da472d051e7845bfd5f20c979f0ff94fca63fb115e2d942e8b3"
  }
}

So I first used a PowerShell task to pull that ID out and export it to a variable:

- task: PowerShell@2
  name: varsCalc
  displayName: 'Calculate variables in Powershell'
  inputs:
    targetType: 'inline'
    script: |
      $extractedId = Echo '$(Agent.ContainerMapping)' | ConvertFrom-Json | Select -Expand sqlserver_linux_container_for_integration_tests | Select -Expand id
      Echo "##vso[task.setvariable variable=sqlServerTestDbId;isOutput=true]$extractedId"

I was then able to use that variable in Docker tasks where I needed the container ID. Because it was set with isOutput=true, it is referenced through the task name, i.e. $(varsCalc.sqlServerTestDbId). For example, copying to [containerId]:/local/container/path:

- task: Docker@2
  displayName: 'TestDb - copy sqlpackage to SQLServer container'
  inputs:
    command: 'cp'
    arguments: '$(Agent.BuildDirectory)/s/DBProject/_TestDb_/sqlpackage-linux.tgz $(varsCalc.sqlServerTestDbId):/tmp/sqlpackage-linux.tgz'
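
For completeness, the later restore step can address the container the same way. This is only a rough sketch under a few assumptions: it assumes the Docker@2 task passes exec straight through to the docker CLI (as it does for cp above), that the tgz has already been extracted to /tmp/sqlpackage-linux inside the container, and that the .bacpac path and target database name shown here are illustrative rather than the real project layout:

- task: Docker@2
  displayName: 'TestDb - restore .bacpac inside SQLServer container'
  inputs:
    command: 'exec'
    # The container ID comes from the variable calculated above; sqlpackage's
    # standard Import action is used to restore the example .bacpac.
    arguments: >-
      $(varsCalc.sqlServerTestDbId)
      /tmp/sqlpackage-linux/sqlpackage
      /Action:Import
      /SourceFile:/tmp/TestDb.bacpac
      /TargetServerName:localhost
      /TargetDatabaseName:TestDb
      /TargetUser:sa
      /TargetPassword:P@ssW0rd!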

Upvotes: 3
