Hillol Saha

Reputation: 121

Azure Data Factory Storage Event Trigger File Path

I have the below scenario:

Any suggestions on the approach, please?

Upvotes: 1

Views: 1529

Answers (1)

David Browne - Microsoft

Reputation: 89406

Check out the cloudFiles streaming source in Databricks. See Streaming data processing and Databricks Auto Loader. You can run the job continuously, or just once for each new file.

It will track the files for you and can run continuously or be triggered by Data Factory, but you don't need Data Factory to pass the filename(s) to Databricks.
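A minimal sketch of that Auto Loader pattern, assuming JSON files landing in a storage container (the paths and table name below are placeholders, not from the question):

```python
# Auto Loader (cloudFiles) sketch -- paths and table name are placeholders.
source_path = "abfss://landing@mystorageaccount.dfs.core.windows.net/incoming/"
checkpoint_path = "abfss://landing@mystorageaccount.dfs.core.windows.net/_checkpoints/incoming/"

# Auto Loader discovers and tracks new files itself;
# no filename has to be passed in from Data Factory.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", checkpoint_path)
      .load(source_path))

(df.writeStream
   .option("checkpointLocation", checkpoint_path)
   .trigger(availableNow=True)  # process only new files, then stop; remove to run continuously
   .toTable("bronze_incoming"))
```

With `trigger(availableNow=True)` the job picks up whatever arrived since the last run and stops, which matches the "just once for each new file" mode; dropping the trigger keeps the stream running continuously.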

Picking up the filenames from the trigger event is also supported, and you can pass them as parameters to Databricks.
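If you do want the event-driven route, the usual pattern is to map `@triggerBody().fileName` and `@triggerBody().folderPath` from the storage event trigger to pipeline parameters, pass those as base parameters on the Databricks Notebook activity, and read them in the notebook via widgets. A rough sketch of the notebook side, where the widget names and storage path are illustrative and must match whatever you configure in the activity:

```python
# Notebook side of the ADF storage-event pattern.
# Widget names ("fileName", "folderPath") are examples; they must match the
# base parameters set on the Databricks Notebook activity in Data Factory.
dbutils.widgets.text("fileName", "")
dbutils.widgets.text("folderPath", "")

file_name = dbutils.widgets.get("fileName")
folder_path = dbutils.widgets.get("folderPath")

# Read just the file that fired the trigger (assuming JSON, as in the sketch above).
file_path = f"abfss://landing@mystorageaccount.dfs.core.windows.net/{folder_path}/{file_name}"
df = spark.read.json(file_path)
df.write.mode("append").saveAsTable("bronze_incoming")
```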

Upvotes: 1
