Gav Cheal

Reputation: 57

Azure Data Factory pipeline set up using parameters

I'm now getting very frustrated as I am not getting the desired results.

What I am trying to do is create a pipeline which sets its parameters dynamically from the name of the Data Factory.

So I have 3 Data Factories, all named exactly the same except for the instance designation.

Apparently I can get that information using the expression @pipeline().DataFactory. I still haven't found a method that tells me exactly what value it returns, however.
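
For reference, one way that should show what the expression resolves to (a minimal sketch, assuming a String pipeline variable named FactoryName has been defined) is a Set Variable activity whose output can then be inspected in the Debug/Monitor pane:

    {
        "name": "Show factory name",
        "type": "SetVariable",
        "typeProperties": {
            "variableName": "FactoryName",
            "value": {
                "value": "@pipeline().DataFactory",
                "type": "Expression"
            }
        }
    }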

From this I can get the instance, since the names of the Data Factories all follow the same pattern and differ only at the instance character.

If I use substring(@pipeline().DataFactory,13,1), this should give me either 'd', 't' or 'p' as the Instance, depending on whether the index is 0-based or not, which I couldn't tell from reading the guides.
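
(For what it's worth, the start index of substring in the Data Factory expression language is zero-based, so position 13 is the fourteenth character. As a dynamic-content sketch, with the @ prefixing the whole expression:)

    @substring(pipeline().DataFactory, 13, 1)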

So once I have this information, I then have something that I can use to determine which structure to use for the connection.

I have created a pipeline that uses several datasets, so it's the datasets that I am trying to make dynamic.

I have set these up so that they use the linked service, and when I manually fill in the values the connections work fine.

So what I am now trying to do is set up a parameter whose value would be: if(equals(@{dataset().Instance}),'p'),'db_prod',equals(@{dataset().Instance}),'t'),'db_test','db_dev')
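
For comparison, the same condition written as a single nested dynamic-content expression might look like this (just a sketch, assuming the dataset parameter is named Instance):

    @if(equals(dataset().Instance, 'p'), 'db_prod', if(equals(dataset().Instance, 't'), 'db_test', 'db_dev'))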

I should then be able to set the linked service parameter equal to this dynamic parameter.
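
A sketch of how that wiring could look in the dataset JSON, assuming a parameterised Azure SQL linked service (the names ds_dynamic_sql, ls_azure_sql, DatabaseName and dbo.MyTable are hypothetical):

    {
        "name": "ds_dynamic_sql",
        "properties": {
            "type": "AzureSqlTable",
            "parameters": {
                "Instance": { "type": "string" }
            },
            "linkedServiceName": {
                "referenceName": "ls_azure_sql",
                "type": "LinkedServiceReference",
                "parameters": {
                    "DatabaseName": {
                        "value": "@if(equals(dataset().Instance, 'p'), 'db_prod', if(equals(dataset().Instance, 't'), 'db_test', 'db_dev'))",
                        "type": "Expression"
                    }
                }
            },
            "typeProperties": {
                "schema": "dbo",
                "table": "MyTable"
            }
        }
    }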

This method doesn't seem to work. I am getting this error:

Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access

I'm guessing it has something to do with how I am trying to pass the parameters around, but I cannot figure out how to get around this, as I haven't managed to find a way to see what the values of these parameters actually are. If I hardcode the values on the linked service, the method works.

Firstly, does this sound feasible? And has anybody else done this, and if so, how did they get around this issue?

Many thanks

Upvotes: 0

Views: 153

Answers (1)

Bright Ran-MSFT

Reputation: 14104

Here are a few suggestions you can consider as a reference:

  1. You can try to conditionally, dynamically set a variable.

• Add a shell script task to the job of your pipeline to conditionally and dynamically set a variable (Bash is used as an example here).
      Instance=@{dataset().Instance}
      if [[ $Instance = p ]]; then
        echo "##vso[task.setvariable variable=database]db_prod"
      elif [[ $Instance = t ]]; then
        echo "##vso[task.setvariable variable=database]db_test"
      else
        echo "##vso[task.setvariable variable=database]db_dev"
      fi
      
• After the above task, you can access the variable as $(database) in the subsequent steps (tasks) in the same job.
  2. If you are using a YAML pipeline and the dynamic parameter is used as an input of a task, you can try using an if clause to conditionally set the task input.

    For example:

    steps:
    - task: {task name}@{version}
      inputs:
        . . .
        ${{ if equals(@{dataset().Instance}, 'p') }}:
          database: 'db_prod'
        ${{ if equals(@{dataset().Instance}, 't') }}:
          database: 'db_test'
        ${{ if equals(@{dataset().Instance}, 'd') }}:
          database: 'db_dev'
    

Upvotes: 1
