Stephen Bowser UK

Reputation: 95

Deployment of Azure Data Factory with parameterized database connections

I have successfully created a parameterized linked service for my Azure SQL DB in Azure Data Factory. However, I am facing problems when I try to deploy it across environments. When I publish my data factory, the ARM template parameters file that gets generated contains only a single parameter for my linked service, and it is not populated:

    "Generic_Databases_connectionString": {
        "value": ""
    },

Following the documentation on the Microsoft website, I figured I could overwrite this in my deployment with the correct parameterised value, something like:

"Server=tcp:myserver.database.windows.net,1433;Database=@{linkedService().DBName};User ID=user;Password=fake;Trusted_Connection=False;Encrypt=True;Connection Timeout=30"

However, since my password is stored in Key Vault, I cannot simply include it here. I suspect the issue is that my ARM template parameters file is simply not being generated properly. Has anyone faced this issue?
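
For reference, this is roughly how I imagined passing the override during deployment (just a sketch; the task inputs, names and paths here are illustrative, not my actual setup):

- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    # ...scope, service connection, resource group etc. omitted...
    csmFile: '$(Pipeline.Workspace)/drop/ARMTemplateForFactory.json'
    csmParametersFile: '$(Pipeline.Workspace)/drop/ARMTemplateParametersForFactory.json'
    overrideParameters: '-Generic_Databases_connectionString "Server=tcp:myserver.database.windows.net,1433;Database=@{linkedService().DBName};User ID=user;Encrypt=True;Connection Timeout=30"'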

[Screenshot: how my dynamic connection looks in the ADF UI]

Upvotes: 0

Views: 821

Answers (1)

Repcak

Reputation: 1006

There are multiple ways to approach this. You didn't give any specifics about how you deploy your ARM templates, so I will assume you use Azure Pipelines.

For both of the examples below to work, I would create an ARM parameters.json file and keep it in the repository.
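
A minimal sketch of what that file could look like, assuming the standard deployment-parameters schema (the factory name is a placeholder, and the connection string value holds whatever placeholder the chosen option will replace: 'dbSecret' for option 1 below, a '#{...}#' token for option 2):

    {
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
            "factoryName": {
                "value": "my-data-factory"
            },
            "Generic_Databases_connectionString": {
                "value": "dbSecret"
            }
        }
    }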

  1. You can use PowerShell scripts to change files. Here is an example:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Path of the ARM parameters file kept in the repository
      $filePath = 'arm.parameters.json'
      $tempFilePath = "temp.arm.parameters.json"
      # Placeholder string in the file and the value to put in its place
      # (in a real pipeline $replace would typically come from a pipeline variable, e.g. "$(VariableNameFromKV)")
      $find = 'dbSecret'
      $replace = 'VariableNameFromKV'
      
      # Write the replaced content to a temp file, then swap it in for the original
      (Get-Content -Path $filePath) -replace $find, $replace | Set-Content -Path $tempFilePath
      
      Remove-Item -Path $filePath
      Move-Item -Path $tempFilePath -Destination $filePath

  2. We personally use the Replace Tokens extension: https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens

Then you would tokenize the value in the parameters.json:

    "Generic_Databases_connectionString": {
        "value": "#{NameOfKVSecret}#"
    },

Then, in your pipeline, for each environment you would run this task, which replaces the tokenized value:

#YAML file
# The GUID below is the Replace Tokens task from the extension linked above
# (it can usually also be referenced by name, e.g. replacetokens@3)
- task: a8515ec8-7254-4ffd-912c-86772e2b5962@3
  displayName: 'Replace Tokens in ARM parameters'
  inputs:
    rootDirectory: '$(Pipeline.Workspace)/drop'
    targetFiles: '**/fileToReplace.json'
    encoding: 'auto'
    writeBOM: true
    keepToken: false
    tokenPrefix: '#{'
    tokenSuffix: '}#'
    enableTelemetry: false
    actionOnMissing: 'continue'
    verbosity: detailed

#Rest of deployment tasks using the replaced parameters.json file
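
# A minimal sketch of such a deployment task (the service connection, variables and
# paths below are illustrative assumptions, not taken from the question):
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'Deploy ADF ARM template'
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-service-connection'            # assumed service connection
    subscriptionId: '$(subscriptionId)'
    action: 'Create Or Update Resource Group'
    resourceGroupName: '$(resourceGroupName)'
    location: '$(location)'
    templateLocation: 'Linked artifact'
    csmFile: '$(Pipeline.Workspace)/drop/ARMTemplateForFactory.json'   # assumed template path
    csmParametersFile: '$(Pipeline.Workspace)/drop/fileToReplace.json' # the file with replaced tokens
    deploymentMode: 'Incremental'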

You would need to import variables into the pipeline. There are multiple ways to do that, depending on whether you use classic or YAML pipelines. If you, for example, have separate Key Vaults for your environments, you could store those values there and use:

#YAML file
- task: AzureKeyVault@1
  displayName: Get KeyVault variables
  inputs:
    azureSubscription: '${{ parameters.KV_APP_NAME }}'
    KeyVaultName: '$(vg.keyVaultName)'
    SecretsFilter: '*'   # download all secrets, or list specific secret names

This task reads the values from the Key Vault into the pipeline as variables that you can then use in the deployment. In your case, you would have a stage per environment.

Both the above tasks can also be used in the classic pipelines.

Overall, for this to work, you should design the deployment pipelines so that every environment has its own stage and the configuration values can be replaced per environment.
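
A rough skeleton of that layout, assuming YAML pipelines (stage names, variable groups, service connections and Key Vault names are placeholders):

#YAML file
stages:
- stage: DeployDev
  variables:
  - group: adf-dev                                   # assumed variable group with dev settings
  jobs:
  - job: Deploy
    steps:
    - task: AzureKeyVault@1
      inputs:
        azureSubscription: 'dev-service-connection'  # assumed
        KeyVaultName: 'kv-adf-dev'                   # assumed
    # ...replace tokens + ARM deployment tasks from above...

- stage: DeployProd
  dependsOn: DeployDev
  variables:
  - group: adf-prod
  jobs:
  - job: Deploy
    steps:
    # same steps, pointing at the prod Key Vault and the prod data factory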

Documentation for classic pipeline variables

Documentation for YAML pipeline variables

Upvotes: 1
