Reputation: 2029
Big Picture: We are using the Azure DevOps release process (so far we create steps in the designer, not YAML pipelines). We release to 3 environments with 3 different databases. Part of the release is deploying a database from a DACPAC file. Since using SqlPackage.exe to publish directly to the database is not very transparent (you don't get to see and review the actual SQL script), we wanted to do the release in 2 stages: first generate the SQL deployment script from the DACPAC, then (after review and approval) run that script against the target database.
Issue: How to share the SQL script file between stages, and how to view it for approval. The stages can be triggered on different agents.
What I've tried:
Any other suggestions?
Upvotes: 10
Views: 3772
Reputation: 4556
Azure DevOps now allows downloading files that were published as artifacts in earlier stages of the pipeline.
In the code I am sharing, I have implemented generation of a SQL script with the database schema changes, followed by applying those changes (after approval). Some remarks:

- The path to sqlpackage.exe is only correct when Visual Studio 2019 is installed, as in the windows-2019 images.
- The variables targetDBConnectionString, servername, databasename, adminlogin and adminPassword are defined in a variable group.
- I set up a manual approval on the ApplyChanges stage (within the Pipelines menu, choose Environments, then the ApplyChanges environment, and then Approvals and checks from the three-dots button in the top right corner). This way the changes are not applied to the database before manual approval takes place.

stages:
- stage: 'Build'
  displayName: 'Build the web application'
  jobs:
  (...)
  - job: BuildDacpac
    pool:
      vmImage: 'windows-2019'
    steps:
    - task: DotNetCoreCLI@2
      displayName: 'Build the database project'
      inputs:
        command: 'custom'
        custom: 'build'
        projects: 'FQDN\For\The\DB\Project\DBProjectName.sqlproj'
    - task: CopyFiles@2
      displayName: 'Copy dacpac file to a staging directory'
      inputs:
        contents: |
          FQDN\For\The\DB\Project\bin\**\*.dacpac
        targetFolder: '$(Build.StagingDirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish build artifact'
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: dropDacpac
      condition: succeededOrFailed()
- stage: VerifyScript
  displayName: 'Script database schema changes'
  dependsOn:
  - Build
  jobs:
  - deployment: VerifyScript
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'scriptverification'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropDacpac
            patterns: '**/*'
          - task: CmdLine@2
            displayName: 'Generate schema changes script'
            inputs:
              script: |
                "c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
                /action:script ^
                /diagnostics:true ^
                /sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
                /targetConnectionString:$(targetDBConnectionString) ^
                /outputpath:$(Build.StagingDirectory)\changesScript.sql
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Build.StagingDirectory)'
              artifactName: dropSqlSchemaChangesScript
            condition: succeededOrFailed()
          - task: PowerShell@2
            displayName: Show Auto Generated SQL Script
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Auto Generated SQL Update Script:"
                Get-Content $(Build.StagingDirectory)\changesScript.sql | foreach { Write-Output $_ }
- stage: ApplyChanges
  displayName: 'Apply database schema changes'
  dependsOn: VerifyScript
  jobs:
  - deployment: ApplyChanges
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'applyChanges'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropSqlSchemaChangesScript
          - task: SqlDacpacDeploymentOnMachineGroup@0
            displayName: 'Deploy SQL schema changes script'
            inputs:
              taskType: 'sqlQuery'
              sqlFile: '$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql'
              targetMethod: 'server'
              authScheme: 'sqlServerAuthentication'
              serverName: '$(servername)'
              databaseName: '$(databasename)'
              sqlUsername: '$(adminlogin)'
              sqlPassword: '$(adminPassword)'
Upvotes: 0
Reputation: 114641
While you can't use pipeline artefacts, you could use Universal Packages in Package Management to publish arbitrary files for later retrieval. Just dumping the file contents to the pipeline logs is the easiest way to allow people to inspect it.
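As a sketch of that approach, the generated script could be published and later retrieved with the Universal Packages task. The feed name MyFeed and package name sql-scripts below are placeholders, and the script is assumed to be in the staging directory:

- task: UniversalPackages@0
  displayName: 'Publish generated script as a Universal Package'
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.StagingDirectory)'   # contains the generated SQL script
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'MyFeed'                       # placeholder feed name
    vstsFeedPackagePublish: 'sql-scripts'           # placeholder package name
    versionOption: 'patch'

Then, in the later stage (after approval), the same package is downloaded:

- task: UniversalPackages@0
  displayName: 'Download the approved script'
  inputs:
    command: 'download'
    feedsToUse: 'internal'
    vstsFeed: 'MyFeed'
    vstsFeedPackage: 'sql-scripts'
    vstsPackageVersion: '*'                         # latest published version
    downloadDirectory: '$(Pipeline.Workspace)/sqlScript'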
You could also create a file with placeholders as a build artifact and merge in the final settings from pipeline variables in each stage; that way you can keep them as a build artifact. That's what I tend to do for any files of this nature. It sounds like this won't apply to your generated SQL file, though.
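A minimal sketch of that placeholder-merge idea, assuming a template file appsettings.template.json in the downloaded artifact containing a token such as __ConnectionString__ (both the file name and the token are hypothetical):

- task: PowerShell@2
  displayName: 'Merge pipeline variables into the template'
  inputs:
    targetType: 'inline'
    script: |
      # Read the template, substitute the stage's variable value for the token,
      # and write the final file next to it
      $content = Get-Content '$(Pipeline.Workspace)\drop\appsettings.template.json' -Raw
      $content = $content -replace '__ConnectionString__', '$(targetDBConnectionString)'
      Set-Content '$(Pipeline.Workspace)\drop\appsettings.json' -Value $content

Because the substitution runs per stage, the same artifact can be reused for every environment while each stage injects its own values.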
Alternatively, if the "seeing for approval" piece is important, you could generate the script, write it to the log, and upload it to Universal Package Management, then ask for approval at the end of the stage. In the next stage, you either download the script from Universal Package Management or regenerate it using the exact same task configuration before execution.
Upvotes: 2