Xkid

Reputation: 409

How to execute a pipeline just once no matter how many blobs are created? (Azure Data Factory)

I've created a pipeline that's executed by a trigger every time a blob is created. The problem is that there are scenarios where the process needs to upload multiple files at the same time; when that happens, the pipeline executes as many times as the number of blobs, which causes the data to be wrong. I tried to configure a Copy Data activity in the main pipeline in order to copy every blob created, but since it is inside the first pipeline, it executes many times as well.

Upvotes: 0

Views: 358

Answers (2)

Nandan

Reputation: 4925

Any reason why you are mapping your event trigger to the original source path where all the files are being created and uploaded? Can't you create a dummy blob path with a dummy file uploaded at the end, so that a single trigger fires only once all the files have been uploaded?

Note: this is how we manage it :) but unfortunately a redundant file is generated.
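For illustration, here is a minimal sketch of that approach with the azure-storage-blob Python SDK, assuming a container named data, an incoming/ folder for the data files and a trigger/ folder that the event trigger is scoped to (connection string, file and folder names are all placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and names; adjust to your storage account.
conn_str = "<storage-connection-string>"
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("data")

# 1. Upload all data files first (the event trigger is NOT scoped to this path).
for path in ["file1.csv", "file2.csv", "file3.csv"]:
    with open(path, "rb") as f:
        container.upload_blob(name=f"incoming/{path}", data=f, overwrite=True)

# 2. Upload a single sentinel blob last, into the path the event trigger watches.
#    Only this blob fires the pipeline, so it runs once per batch.
container.upload_blob(name="trigger/_SUCCESS", data=b"", overwrite=True)
```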

Upvotes: 1

KarthikBhyresh-MT

Reputation: 5034

What you can do is filter the copy activity source based on the Filter by last modified property, where you can specify a start time and an end time in UTC.

You can try this tutorial: Incrementally copy new and changed files based on LastModifiedDate by using the Copy Data tool.
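As a rough, conceptual illustration of what that filter does (the real filter is configured on the copy activity source in ADF, not in code; the connection string, container and folder names below are placeholders):

```python
from datetime import datetime, timezone
from azure.storage.blob import BlobServiceClient

# Placeholder values; in ADF the window is set on the copy activity source.
conn_str = "<storage-connection-string>"
start_time = datetime(2022, 1, 1, tzinfo=timezone.utc)  # "Filter by last modified" start
end_time = datetime.now(timezone.utc)                   # "Filter by last modified" end

container = BlobServiceClient.from_connection_string(conn_str).get_container_client("data")

# Keep only blobs whose last-modified time falls inside the [start_time, end_time] window (UTC).
new_or_changed = [
    b.name for b in container.list_blobs(name_starts_with="incoming/")
    if start_time <= b.last_modified <= end_time
]
print(new_or_changed)
```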

OR...

As per your scenario, just specify the start time.

  1. This start time is nothing but the last time a triggered pipeline run was executed. You can get the triggered pipeline run details using the REST API call Trigger Runs - Query By Factory.
  2. Now you can choose to query the runs that were executed in the last x hours or, to be safe, in the last day, depending on how frequently files are created in Storage.
  3. Next, from this result collect only triggerRunTimestamp and append it to an array variable.
  4. Find the max or last run time using functions and set this time as the StartTime in UTC for the copy activity source filter, as explained at the start (see the sketch after this list).
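Here is a minimal sketch of steps 1-4 calling the Trigger Runs - Query By Factory REST API directly from Python (subscription, resource group and factory names are placeholders, and the bearer token is assumed to come from azure-identity):

```python
from datetime import datetime, timedelta, timezone
import requests
from azure.identity import DefaultAzureCredential

# Placeholder identifiers for the factory whose trigger runs we want to inspect.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<factory-name>"

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory_name}/queryTriggerRuns?api-version=2018-06-01"
)

# Step 2: query trigger runs from the last day (adjust the window to your upload frequency).
now = datetime.now(timezone.utc)
body = {
    "lastUpdatedAfter": (now - timedelta(days=1)).isoformat(),
    "lastUpdatedBefore": now.isoformat(),
}
runs = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"}).json()

# Steps 3-4: collect the triggerRunTimestamp values and take the latest one.
# ISO-8601 UTC strings in the same format sort lexicographically, so max() works here.
timestamps = [r["triggerRunTimestamp"] for r in runs.get("value", [])]
start_time_utc = max(timestamps) if timestamps else None

# start_time_utc becomes the "Filter by last modified" StartTime on the copy activity source.
print(start_time_utc)
```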

If this is feasible, I can spin up an example pipeline.

Upvotes: 2
