Reputation: 347
I am working in Databricks. I have a notebook that relates n tables using joins and other functions.
I would like to create a workflow that automatically runs this notebook whenever at least one of the input tables is updated, given that these tables are updated by separate flows at different times of the day.
Is this possible? I don't understand how to set this up in Workflows (without using a time schedule or another notebook).
Thanks for your help!
Upvotes: 0
Views: 90
Reputation: 1
This is not currently possible in Databricks, but there is a feature in private preview that will allow exactly what you describe: table triggers (see here: https://www.linkedin.com/posts/laurentdhondt_databricks-azure-deltalake-activity-7200756908134154240-r89H/). Databricks announced it last year, but it has not yet been added to the main product.
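In the meantime, one common workaround (my own suggestion, not something the feature announcement describes) is a frequently scheduled job that polls each input table's latest Delta version (e.g. via `DESCRIBE HISTORY <table> LIMIT 1`) and runs the notebook only when at least one table has advanced. A minimal sketch of the decision logic, with the metadata lookup stubbed out so it does not depend on a Databricks cluster:

```python
from typing import Callable, Dict

def tables_updated(
    last_seen: Dict[str, int],
    get_latest_version: Callable[[str], int],
) -> Dict[str, int]:
    """Return the tables whose latest Delta version is newer than the
    version recorded in `last_seen` (table name -> last processed version).

    In a real job, `get_latest_version` would query Delta metadata
    (e.g. DESCRIBE HISTORY); here it is injected so the logic is
    testable anywhere.
    """
    return {
        table: latest
        for table, last in last_seen.items()
        if (latest := get_latest_version(table)) > last
    }

# Example: table_b advanced from version 3 to 5, table_a is unchanged.
current_versions = {"table_a": 7, "table_b": 5}
changed = tables_updated(
    {"table_a": 7, "table_b": 3},
    current_versions.__getitem__,
)
# `changed` contains only table_b, so the scheduled job would
# trigger the notebook run and then persist the new versions.
```

The scheduled job would persist the returned versions (e.g. in a small state table) after each notebook run, so the next poll compares against them.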
Upvotes: 0