frictionlesspulley

Reputation: 12438

Azure Data Factory v2 parameters for connection string

I am new to Azure Data Factory v2 and have a few questions about transforming connection strings / linked services when deploying to multiple environments.

Coming from an SSIS background:

We used to define connection strings as project parameters, which allowed the connection string to be transformed when deploying the artifacts to different environments.

How can I accomplish the same using Azure Data Factory v2? Is there an easy way to do this?

I was trying to set up linked services with connection strings as parameters, which could then be passed along with the triggers. Is this feasible?

Upvotes: 0

Views: 3136

Answers (3)

frictionlesspulley

Reputation: 12438

I ended up solving this by setting up an Azure Key Vault per environment, each holding a connection string secret (more details here: https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault):

- dev
    - dev-azure-datafactory
    - dev-key-vault
        - key: db-conn-string
          value: dev-db.windows.net
- qa
    - qa-azure-datafactory
    - qa-key-vault
        - key: db-conn-string
          value: qa-db.windows.net
- production
    - prod-azure-datafactory
    - prod-key-vault
        - key: db-conn-string
          value: prod-db.windows.net

In Azure Data Factory:

  • Define an Azure Key Vault linked service.

  • Use the Azure Key Vault linked service when defining the connection string(s) for other linked services (see the sketch below).

  • This approach removes the need to change parameters in the actual linked services.

  • The Azure Key Vault linked service used for the connection string can be switched as part of your Azure Pipelines deployment (more details here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment).

  • Each data factory can be given access to its Key Vault using MSI (we automated this with Terraform in our case).
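As a rough sketch of how the linked service JSON can look (the names `AzureKeyVaultLS` and `AzureSqlDbLS` are placeholders; the secret name matches the layout above), the Key Vault linked service only needs the vault's base URL:

```json
{
    "name": "AzureKeyVaultLS",
    "properties": {
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://dev-key-vault.vault.azure.net/"
        }
    }
}
```

Other linked services can then pull their connection string from the vault by secret name instead of storing it inline:

```json
{
    "name": "AzureSqlDbLS",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "AzureKeyVaultLS",
                    "type": "LinkedServiceReference"
                },
                "secretName": "db-conn-string"
            }
        }
    }
}
```

Since the environment-specific value lives only in the Key Vault secret (and in the vault's `baseUrl`), promoting to qa/production just means pointing at that environment's vault during deployment.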

Upvotes: 0

Daryl Macam

Reputation: 416

This feature is now available; see the URL below. Are you the one who requested the feature? :)

https://azure.microsoft.com/en-us/blog/parameterize-connections-to-your-data-stores-in-azure-data-factory/
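As a quick sketch of what that feature enables (the linked service and parameter names below are made up for illustration), a linked service can declare its own parameters and reference them in the connection string:

```json
{
    "name": "ParameterizedAzureSqlLS",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "serverName": { "type": "String" },
            "databaseName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": "Server=tcp:@{linkedService().serverName}.database.windows.net,1433;Database=@{linkedService().databaseName};"
        }
    }
}
```

The dataset or activity that references the linked service then supplies values for `serverName` and `databaseName`.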

Upvotes: 1

Abhishek

Reputation: 2490

Relating to SSIS (where we would use configuration files, .dtsconfig, for deployment to different environments): for ADF v2 (and ADF v1 too) you could look at the option of using ARM templates, where for every environment (dev, test and prod) you create a separate deployment parameter file (.json) and script the deployments with PowerShell. ARM template parameters can be used to parameterize connections to linked services and other environment-specific values. There are also ADF v2-specific PowerShell cmdlets for creating/deploying ADF v2 pipelines.
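A minimal sketch of such a per-environment ARM parameter file (the parameter names here are illustrative, not the exact names ADF generates) could look like this, with one copy per environment:

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "qa-azure-datafactory"
        },
        "AzureSqlDbLS_connectionString": {
            "value": "Server=tcp:qa-db.windows.net,1433;Database=mydb;"
        }
    }
}
```

Each environment's file is then passed to the same ARM template during the scripted deployment (for example with New-AzResourceGroupDeployment).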

You can also use PowerShell directly to parameterize connections to linked services and other environment-specific values.

With the ADF v2 UI, VSTS Git integration is possible, and so is deployment through it. VSTS Git integration lets you choose a feature/development branch or create a new one in the VSTS Git repository; once the changes are merged into the master branch, they can be published to the data factory from the ADF v2 UI.

Upvotes: 0
