Tarta

Reputation: 2063

Using parameters to locate file during trigger creation in Azure Data Factory

I am trying to create a trigger that I will use for starting a pipeline in ADF:


The folder I want to set my trigger on can have different paths:

etc.

Therefore, in the Blob path begins with field I would like to use a parameter (which I will set elsewhere, through another pipeline) that tells the trigger where to look, instead of a static file name.

Unfortunately, the trigger does not seem to offer the option to add dynamic content the way a DataSet does (for example). If that is truly not possible, perhaps because a trigger is instantiated only once, is there a way to create a trigger as a step within a pipeline?

Thank you!

Upvotes: 1

Views: 459

Answers (1)

HarithaMaddi-MSFT

Reputation: 551

It is possible to pass the value as a parameter in the ARM template of the Azure Data Factory. When the pipelines are deployed, this parameter can be supplied with the necessary value. Below is sample code for it.

Sample Code:

{
  "name": "[concat(parameters('factoryName'), '/trigger1')]",
  "type": "Microsoft.DataFactory/factories/triggers",
  "apiVersion": "2018-06-01",
  "properties": {
    "annotations": [],
    "runtimeState": "Stopped",
    "pipelines": [],
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "[parameters('trigger1_properties_typeProperties_blobPathBeginsWith')]",
      "ignoreEmptyBlobs": true,
      "scope": "[parameters('trigger1_properties_typeProperties_scope')]",
      "events": [
        "Microsoft.Storage.BlobCreated"
      ]
    }
  }
}
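For the template above to accept that value, the ARM template's parameters section must declare the parameter, and a value can then be supplied at deployment time. A minimal sketch follows; the resource group name, template file name, and the example path /mycontainer/input/folderA/ are placeholders, not values from the question.

```json
{
  "parameters": {
    "factoryName": {
      "type": "string"
    },
    "trigger1_properties_typeProperties_blobPathBeginsWith": {
      "type": "string",
      "defaultValue": "/mycontainer/input/folderA/"
    },
    "trigger1_properties_typeProperties_scope": {
      "type": "string"
    }
  }
}
```

The value can then be overridden per deployment, for example with the Azure CLI:

az deployment group create --resource-group MyResourceGroup --template-file arm_template.json --parameters trigger1_properties_typeProperties_blobPathBeginsWith='/mycontainer/input/folderB/'

Note that this changes the path at deployment time, not at pipeline run time, so it does not make the trigger dynamic between runs.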

Upvotes: 0
