Reputation: 21
I need to invoke a Data Factory V2 pipeline when a file is placed in a blob container.
I have tried using PowerShell to check whether the file is present. The issue is that if the file is not there, the script reports that correctly, but when I then place the file in the container the script still reports it as missing. Perhaps if the script reruns, the variable will get a fresh value and report that it's there? Maybe there is a way around that? If so, I could use the result to invoke the pipeline from the PowerShell script. Am I along the right lines here?
Another option would be to write a T-SQL query that returns a true/false result when the row condition is met, but I am not sure how I can use that result within/against DFv2. In the If Condition activity?
I tried a Logic App, but it was of little use. It would be great to get some suggestions on ways to trigger the pipeline on the arrival of a file in the blob container. There is more than one way to skin a cat, so I'm open to any and all ideas. Thank you.
Upvotes: 2
Views: 6543
Reputation: 2844
This is now available as an event trigger in ADF V2, as announced in this blog post on June 21, 2018.
Current documentation on how to set it up is available here: Create a trigger that runs a pipeline in response to an event.
From the documentation:
As soon as the file arrives in your storage location and the corresponding blob is created, this event triggers and runs your Data Factory pipeline. You can create a trigger that responds to a blob creation event, a blob deletion event, or both events, in your Data Factory pipelines.
There is a note to be wary of:
This integration supports only version 2 Storage accounts (General purpose).
An event trigger can fire on one or both of:
Microsoft.Storage.BlobCreated
Microsoft.Storage.BlobDeleted
With firing conditions from the following:
blobPathBeginsWith
blobPathEndsWith
The documentation also provides examples of event trigger firing conditions over blobs.
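To illustrate how these pieces fit together, here is a rough sketch of what a blob event trigger definition looks like in ADF V2 JSON. The names (`BlobCreatedTrigger`, `MyPipeline`), the container path, and the subscription/resource identifiers are all placeholders you would replace with your own; this follows the `BlobEventsTrigger` shape described in the documentation, but check the current docs for the exact schema:

```json
{
    "name": "BlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/mycontainer/blobs/",
            "blobPathEndsWith": ".csv",
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": [
                "Microsoft.Storage.BlobCreated"
            ]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```

With this in place, dropping a `.csv` file into `mycontainer` raises a `Microsoft.Storage.BlobCreated` event that starts `MyPipeline` automatically, so no polling script is needed.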
Upvotes: 1