Abhi

Reputation: 351

How to update ADF Pipeline level parameters during CICD

Being a novice to ADF CICD, I am currently exploring how to update pipeline-scoped parameters when deploying a pipeline from one environment to another. Here is the detailed scenario -
I have a simple ADF pipeline with a copy activity moving files from one blob container to another
Example - Below, the pipeline contains a copy activity and has two parameters, each with a default value:
1- SourceBlobContainer
2- SinkBlobContainer

(screenshot: pipeline with copy activity and its two parameters)

Here is how the dataset is configured to consume these Pipeline scoped parameters.

(screenshot: dataset configuration referencing the pipeline parameters)

Since this is the development environment, the default values are fine. But the Test environment will have containers with altogether different names (like "TestSourceBlob" & "TestSinkBlob").
That said, the CI/CD process should handle this by updating the default values of these parameters during deployment.

Reading the documentation, I found nothing that handles this use case.
Here are some links which I referred to -

Upvotes: 0

Views: 3300

Answers (3)

Christopher Liquori

Reputation: 1

You would have to use the arm-template-parameters-definition.json file. You can edit the file directly in ADF.

Edit Parameters File

Example to include all pipeline parameters in your ARM template:

"Microsoft.DataFactory/factories/pipelines": {
    "properties": {
        "parameters": {
            "*": {
                "defaultValue": "="
            }
        }
    }
}

Source: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-resource-manager-custom-parameters
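With that rule in place, publishing generates an ARM template parameter for each pipeline parameter's default value, which you can then override per environment in the generated parameters file. A sketch of what a Test-environment parameters file might look like (the exact generated parameter names depend on your pipeline name — "CopyPipeline" below is illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "CopyPipeline_properties_parameters_SourceBlobContainer_defaultValue": {
      "value": "TestSourceBlob"
    },
    "CopyPipeline_properties_parameters_SinkBlobContainer_defaultValue": {
      "value": "TestSinkBlob"
    }
  }
}
```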

Upvotes: 0

Redbeard

Reputation: 293

There is a "new" way to do ci/cd for ADF that should handle this exact use case. What I typically do is add global parameters and then reference those everywhere (in your case from the pipeline parameters). Then in your build you can override the global parameters with the values that you want. Here are some links to references that I used to get this working.

The "new" ci/cd method follows something like what is outlined here: Azure Data Factory CI-CD made simple: Building and deploying ARM templates with Azure DevOps YAML Pipelines. If you have followed this, something like the following should work in your yaml:

overrideParameters: '-dataFactory_properties_globalParameters_environment_value "new value here"'

Here is an article that goes into more detail on the overrideParameters: ADF Release - Set global params during deployment

Here is a reference on global parameters and how to get them exposed to your ci/cd pipeline: Global parameters in Azure Data Factory
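Putting these pieces together, a deployment step in an Azure DevOps YAML pipeline might look roughly like this sketch (the service connection, resource group, artifact paths, and override value are placeholders you would replace with your own):

```yaml
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-service-connection'   # placeholder
    subscriptionId: '$(subscriptionId)'
    resourceGroupName: 'rg-adf-test'                          # placeholder
    location: 'West Europe'
    templateLocation: 'Linked artifact'
    csmFile: '$(Pipeline.Workspace)/adf-artifact/ARMTemplateForFactory.json'
    csmParametersFile: '$(Pipeline.Workspace)/adf-artifact/ARMTemplateParametersForFactory.json'
    deploymentMode: 'Incremental'
    overrideParameters: '-dataFactory_properties_globalParameters_environment_value "Test"'
```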

Upvotes: 0

Kamil Nowinski

Reputation: 545

There is another approach, as opposed to the ARM templates located in the 'adf_publish' branch. Many companies leverage that workaround and it works great.
I have spent several days building a brand-new PowerShell module that publishes the whole Azure Data Factory code from your master branch, or directly from your local machine. The module resolves all the pain points that existed so far in other solutions, including:

  • replacing any property in JSON file (ADF object),
  • deploying objects in an appropriate order,
  • deploying a subset of objects,
  • deleting objects not existing in the source any longer,
  • stop/start triggers, etc.

The module is publicly available in PS Gallery: azure.datafactory.tools
Source code and full documentation are in GitHub here.
Let me know if you have any questions or concerns.
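As a quick illustration, a minimal publish from a local folder might look like the sketch below (the folder path, resource group, factory name, and stage are hypothetical — see the module's documentation for the per-stage config file format):

```powershell
Install-Module -Name azure.datafactory.tools -Scope CurrentUser

# Publish ADF code from a local folder (path and names below are hypothetical).
# A per-stage config file (e.g. deployment\config-Test.csv) can replace
# properties such as pipeline parameter default values for each environment.
Publish-AdfV2FromJson `
    -RootFolder 'C:\repos\my-adf' `
    -ResourceGroupName 'rg-adf-test' `
    -DataFactoryName 'adf-test' `
    -Location 'West Europe' `
    -Stage 'Test'
```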

Upvotes: 1
