GHariz

Reputation: 338

Can Azure Data Factory V2 copy files that are updated every 20 seconds from on-premises to Azure Data Lake as soon as they change?

I am trying to use ADF V2 to copy real-time data from on-premises to Azure Data Lake Store. I have already installed a self-hosted Integration Runtime on the on-premises server, and then used the Copy Data wizard to create the pipeline and the datasets.
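For reference, the pipeline the wizard generated boils down to a Copy activity roughly like the following (the dataset names below are placeholders, not the actual generated names):

```json
{
    "name": "CopyOnPremToADLS",
    "type": "Copy",
    "inputs": [ { "referenceName": "OnPremFileDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "ADLSOutputDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "FileSystemSource", "recursive": true },
        "sink": { "type": "AzureDataLakeStoreSink" }
    }
}
```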

I tried using a tumbling window trigger to fire as soon as the file changes, but saw that the minimum interval it supports is 15 minutes.
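The trigger definition I tried looks roughly like the sketch below (the trigger and pipeline names are placeholders); setting `interval` below 15 minutes is rejected:

```json
{
    "name": "Every15MinTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Minute",
            "interval": 15,
            "startTime": "2018-06-01T00:00:00Z",
            "maxConcurrency": 1
        },
        "pipeline": {
            "pipelineReference": {
                "referenceName": "CopyOnPremToADLS",
                "type": "PipelineReference"
            }
        }
    }
}
```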

I saw that there is a Get Metadata activity that can read the Last Modified time of a file-system file, and that datasets allow parameters to be added. However, in V2 there is no Help button or examples for that, so I am assuming I would have to edit the JSON definitions directly.
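From what I can tell, the JSON would look something like the sketches below (the dataset name, folder path, and `fileName` parameter are placeholder values I made up). First, a file-system dataset that declares a parameter:

```json
{
    "name": "OnPremFileDataset",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "OnPremFileServer",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "fileName": { "type": "String" }
        },
        "typeProperties": {
            "folderPath": "incoming",
            "fileName": "@dataset().fileName"
        }
    }
}
```

Then a Get Metadata activity that passes the parameter and asks for the `lastModified` field:

```json
{
    "name": "GetLastModified",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "OnPremFileDataset",
            "type": "DatasetReference",
            "parameters": { "fileName": "data.csv" }
        },
        "fieldList": [ "lastModified" ]
    }
}
```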

However, before I do that, the question is: can the pipeline be triggered every 2 or 3 seconds (continuously) to check whether the files have changed and then copy them to Azure Data Lake Store?

If that is possible, any example(s) would be really helpful.

Thanks

Upvotes: 1

Views: 351

Answers (1)

DraganB

Reputation: 1138

What is your source on the on-premises server? You could use a combination of a Logic App and ADF: the Logic App can be triggered as soon as a file arrives on your on-premises source side, and it can then invoke the Data Factory pipeline to copy that file.
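As a rough sketch (not a full workflow; the angle-bracket values are placeholders you would fill in), once the file trigger fires, the Logic App can call Data Factory's `createRun` REST endpoint from an HTTP action:

```json
{
    "Start_copy_pipeline": {
        "type": "Http",
        "inputs": {
            "method": "POST",
            "uri": "https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataFactory/factories/<factory-name>/pipelines/<pipeline-name>/createRun?api-version=2018-06-01",
            "authentication": {
                "type": "ManagedServiceIdentity",
                "audience": "https://management.azure.com/"
            }
        }
    }
}
```

This assumes the Logic App has a managed identity that has been granted a role with pipeline-run rights (for example, Data Factory Contributor) on the factory.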

Upvotes: 1
