Colin Olliver

Reputation: 21

Azure Data Factory - Excluding folder within a container when using storage events triggers

I'm currently trying to find a solution to our trigger-happy ADF triggers. I have taken over a Databricks / Delta Lake ETL solution built on top of a Gen2 data lake, which uses ADF for orchestration.

The current setup uses a RAW > STAGE > BASE > ENRICHED data flow. When a file lands in any RAW folder in any of the containers, a pipeline is triggered that calls a Databricks notebook, which automatically converts TXT/CSV/XLSX/JSON files into a Delta table. This process works as expected and saves us a considerable amount of time. The issue is that I don't want this trigger to fire on every storage event.
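For context, the notebook does something along these lines (a minimal sketch; the widget name, TXT delimiter, Excel connector, and STAGE path layout are placeholders rather than the exact production code):

```python
# Minimal sketch of the conversion notebook, assuming it runs on Databricks
# (where `spark` and `dbutils` are ambient) and receives the landed file's
# path from the ADF pipeline via a widget. Names/paths are hypothetical.
import os

dbutils.widgets.text("source_path", "")
source_path = dbutils.widgets.get("source_path")

ext = os.path.splitext(source_path)[1].lower()

if ext in (".csv", ".txt"):
    # TXT is assumed to be delimited text; adjust the separator as needed
    df = spark.read.option("header", "true").option("sep", ",").csv(source_path)
elif ext == ".json":
    df = spark.read.json(source_path)
elif ext == ".xlsx":
    # Reading Excel requires an extra library, e.g. the spark-excel connector
    df = (spark.read.format("com.crealytics.spark.excel")
          .option("header", "true")
          .load(source_path))
else:
    raise ValueError(f"Unsupported file type: {ext}")

# Land the result in the STAGE layer as a Delta table (hypothetical layout:
# mirror the RAW path and drop the file extension)
target_path = source_path.replace("/RAW/", "/STAGE/").rsplit(".", 1)[0]
df.write.format("delta").mode("overwrite").save(target_path)
```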

Obviously I could be more prescriptive with the trigger, but my concern is that various other processes could break along the way. So I decided to see if there was a way to exclude specific folders within containers.

I have referred to the documentation here, but I can only see ways to include locations, not exclude them.


Does anyone know if this is possible? If not, I will have to look at creating far more triggers than I'd like in order to cover all of our containers (25+).

Thanks

Upvotes: 0

Views: 1108

Answers (1)

AnnuKumari

Reputation: 563

As of now, there is no option to exclude folders or containers.

You can try this approach:

Keep all the folders that should fire the trigger under the same naming convention.

For example: trigger_Folder1, trigger_Folder2, etc. Then, in the storage event trigger, set the Blob path begins with option to 'trigger_'.
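To make the matching behaviour concrete, here is a minimal plain-Python sketch of how a Blob path begins with prefix selects events under this convention (the container and folder names are made-up examples, and the exact path format ADF expects, e.g. /container/blobs/..., may differ):

```python
# Illustrates which blob paths a "Blob path begins with" prefix would match;
# all names below are hypothetical.
PREFIX = "mycontainer/trigger_"

candidate_blobs = [
    "mycontainer/trigger_Folder1/sales.csv",    # matches: trigger fires
    "mycontainer/trigger_Folder2/orders.json",  # matches: trigger fires
    "mycontainer/scratch/temp.txt",             # no prefix: event ignored
]

for blob_path in candidate_blobs:
    status = "fires trigger" if blob_path.startswith(PREFIX) else "ignored"
    print(f"{blob_path}: {status}")
```

Because the match is a plain prefix, anything outside the trigger_ folders never fires the pipeline, so no explicit exclusion list is needed.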

Upvotes: 1
