Reputation: 102
I want to design Azure Functions in Python to read events from more than one event hub. The number of event hubs is not constant; it can increase or decrease (for example, one event hub publishes logs from one Azure AD tenant, and in the future more AD tenants can be added). Each function will have the same business logic; only the event hub connection string will differ.
An Event Hub triggered function can subscribe to only one event hub, not more than one.
So, what is the best approach for this problem statement?
Option 1: Create one Event Hub triggered function for each event hub, keep all functions under one Azure Function App, and have shared code that can be used by all the functions.
Cons: If a new event hub is added, a new function needs to be created and deployed standalone with zero downtime for the Function App, which is not possible. As per the Azure docs, the deployment unit should be the Function App, not individual functions.
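To make the shared-code idea concrete, here is a minimal sketch of what the common business logic could look like as a plain Python module that every per-hub function imports. The names (`process_events`, `normalize_log`, the `source` field) are illustrative assumptions, not taken from the question:

```python
# Hypothetical shared module (e.g. shared_code/logic.py) that each
# Event Hub triggered function in the Function App would import.
import json
from typing import List


def normalize_log(raw: str) -> dict:
    """Parse one event body and tag it with a source field (assumed schema)."""
    record = json.loads(raw)
    record.setdefault("source", "unknown-tenant")
    return record


def process_events(bodies: List[str]) -> List[dict]:
    """The common logic each per-hub function delegates to."""
    return [normalize_log(b) for b in bodies]
```

Each trigger function then stays a thin wrapper that only differs in its binding configuration, not in code.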
Option 2: Create one Function App per event hub, each with a single Event Hub triggered function subscribing to that event hub.
Pros: As the Function App is different for each event hub, adding a new event hub won't impact the other Function Apps, so there is zero downtime for them.
Challenge: Assuming we have one GitHub repo with the function logic and function.json, is it possible to create a new Function App each time a new event hub is added and set up a deployment integrated with the Git repo? In the future I want to use Azure Pipelines or GitHub Actions for CI/CD; will this approach create any issues?
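One way to keep a single repo deployable to many Function Apps is to parameterize the trigger binding through app settings, so every app ships identical code and function.json and only its settings differ. A sketch of such a function.json for a Python Event Hub trigger (the setting names `EventHubConnection` and `EventHubName` are my own choices):

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "events",
      "eventHubName": "%EventHubName%",
      "connection": "EventHubConnection",
      "cardinality": "many",
      "consumerGroup": "$Default"
    }
  ]
}
```

Here `connection` names an app setting holding the connection string, and `%EventHubName%` is a binding expression resolved from app settings at runtime, so each Function App points at a different hub without any code change.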
Upvotes: 1
Views: 707
Reputation: 6796
If the business logic is exactly the same, then you can extract the core logic and put it in an HTTP trigger function. You then only need to create multiple Azure Logic Apps to trigger it and pass the content of the event hub to the function for processing.
This way you don't have to worry about the downtime caused by multiple deployments, and I think it is easier to create multiple Azure Logic Apps than to create multiple Azure Functions.
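A minimal sketch of the HTTP-trigger side of this design, assuming the Logic App POSTs the Event Hub message body as JSON; the handler name and the response shape are hypothetical:

```python
import json


def handle_payload(body: bytes) -> dict:
    """Core handler the HTTP trigger function delegates to.

    Assumes the Logic App forwards the Event Hub message as a JSON body;
    the shared business logic would run where the comment indicates.
    """
    event = json.loads(body.decode("utf-8"))
    # ... shared business logic on `event` goes here ...
    return {"status": "processed", "keys": sorted(event)}
```

Keeping the parsing and logic in a plain function like this also makes it unit-testable without any Azure runtime.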
You can refer to my logic app design:
Upvotes: 1