Dhiraj

Reputation: 3696

ADFv1 dependency problem for copy activity

I have a single-activity pipeline in ADFv1 that produces an AzureBlob dataset D1. Now I want to create either another dependent activity or another dependent pipeline (either is fine) such that the child activity is a copy activity: it should only detect whether D1 is Ready, and if it is, perform a Blob-to-Blob copy operation, but the source dataset won't be D1; it will be a different AzureBlob dataset altogether, call it Dx. So the status of D1 is only used as a trigger point. How can I do this? If I simply provide D1 as input to the copy activity, it will try to copy D1's data itself, which is not what I want: it should detect the status of D1 and then start copying the contents of Dx to the copy activity's target dataset.

Upvotes: 0

Views: 45

Answers (1)

Jay Gong

Reputation: 23782

To my knowledge, there is no such built-in feature in ADF V1, so I'll just offer my idea here for your reference.

First, I don't know what your criterion for "D1 is Ready" is, because a dataset has no such status in ADF. I assume you know your own logic for that status (something like a file name matching a format, or a specific time?). Whatever it is, just create a normal pipeline with a copy activity whose source is Dx.

Then you could add a trigger on your D1. Here you have two options:

1. Azure Function Blob Trigger: https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function

It's a real-time trigger: every new blob in your storage container fires the function. Inside the function you can check the Ready status and call the pipeline-run REST API. The downside is that you need to authenticate when you call the REST API; the upside is that you can debug your code however you want.

2. Logic App Blob Trigger: https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-azureblobstorage#add-blob-storage-trigger

As far as I know, it's not a real-time trigger; it polls the container on a schedule instead. You then check the Ready status and run the pipeline directly with the ADF connector. The downside is that conditional logic is difficult to debug in a Logic App; the upside is that you don't have to worry about ADF REST API authentication!
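Option 1 could be sketched roughly as below. Everything specific here is an assumption, not something from ADF itself: the "Ready" convention (blob name ending in `_ready.txt`), the app-setting names, and the pipeline-run URL (shown in the V2 `createRun` style; the V1 management endpoint differs, so treat it as a placeholder you'd replace with the correct one for your factory).

```python
# Hypothetical Azure Function body: check a "Ready" condition on the new blob
# and, if satisfied, start the copy pipeline via the management REST API.
import os
import re
import requests

# Assumed "Ready" convention: the triggering blob name ends with "_ready.txt".
READY_PATTERN = re.compile(r"_ready\.txt$")

def is_ready(blob_name: str) -> bool:
    """Decide whether D1 is 'Ready' based on the blob name (your logic here)."""
    return READY_PATTERN.search(blob_name) is not None

def get_aad_token() -> str:
    """Acquire an Azure AD token for the management API (client-credentials flow).

    TENANT_ID / CLIENT_ID / CLIENT_SECRET are assumed app settings.
    """
    tenant = os.environ["TENANT_ID"]
    resp = requests.post(
        f"https://login.microsoftonline.com/{tenant}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["CLIENT_ID"],
            "client_secret": os.environ["CLIENT_SECRET"],
            "resource": "https://management.azure.com/",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def trigger_pipeline() -> None:
    """POST a pipeline run. URL is a V2-style placeholder; V1 differs."""
    url = (
        "https://management.azure.com/subscriptions/{sub}/resourceGroups/{rg}"
        "/providers/Microsoft.DataFactory/factories/{df}/pipelines/{p}"
        "/createRun?api-version=2018-06-01"
    ).format(
        sub=os.environ["SUBSCRIPTION_ID"],
        rg=os.environ["RESOURCE_GROUP"],
        df=os.environ["FACTORY_NAME"],
        p=os.environ["PIPELINE_NAME"],
    )
    headers = {"Authorization": f"Bearer {get_aad_token()}"}
    requests.post(url, headers=headers).raise_for_status()

def main(blob_name: str) -> None:
    """Entry point: in a real function this receives the blob-trigger input."""
    if is_ready(blob_name):
        trigger_pipeline()
```

The point of splitting out `is_ready` is that it isolates your own "Ready" logic, which the answer notes only you know, from the boilerplate of authenticating and calling the REST API.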

Upvotes: 1
