Reputation: 117
We have multiple (Java) Azure Functions that we deploy multiple times (~10 deployments for now, potentially hundreds in the future) using Terraform, to different customers' environments.
These functions are built and deployed using our CI/CD pipeline. At the end of the pipeline, we have a ZIP file that we put in Azure Blob Storage (e.g. myFunction-latest.zip).
When we deploy a function (using Terraform), we supply a SAS URL (valid for a long time) to this zip (myFunction-latest.zip) in the "WEBSITE_RUN_FROM_PACKAGE" app setting of the function. This works great: using the SAS URL, the function app pulls the zip from Blob Storage and starts the function.
My question is how we should handle updates to the functions' source. Our CI/CD will overwrite myFunction-latest.zip in Blob Storage, but how will these (potentially hundreds of) functions know it changed? According to the documentation, we need to 'sync triggers'. Syncing triggers can be done in several ways.
I have several options to do this; which would be best? We would prefer a 'pull-based' approach so that we don't have to push changes to hundreds of clients' environments.
What would be the best option? I am afraid of the overhead of using option 3, so I'm thinking of going with option 2.
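For reference, one way to 'sync triggers' is a plain POST to the ARM `syncfunctiontriggers` endpoint of each function app. A minimal sketch in Java (the subscription, resource group and app names are placeholders, and `DefaultAzureCredential` is just one way we could authenticate):

```java
// Minimal sketch of a "sync triggers" call against the ARM endpoint.
// Subscription, resource group and app names are placeholders; the api-version
// shown is an assumption and may need updating for your environment.
import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.identity.DefaultAzureCredentialBuilder;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SyncTriggers {
    public static void main(String[] args) throws Exception {
        String subscriptionId = "<subscription-id>";   // placeholder
        String resourceGroup  = "<resource-group>";    // placeholder
        String functionApp    = "<function-app-name>"; // placeholder

        // Token for the Azure management plane (e.g. via managed identity or az login).
        AccessToken token = new DefaultAzureCredentialBuilder().build()
                .getToken(new TokenRequestContext()
                        .addScopes("https://management.azure.com/.default"))
                .block();

        String url = String.format(
                "https://management.azure.com/subscriptions/%s/resourceGroups/%s"
                        + "/providers/Microsoft.Web/sites/%s/syncfunctiontriggers?api-version=2022-03-01",
                subscriptionId, resourceGroup, functionApp);

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + token.getToken())
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Sync triggers returned HTTP " + response.statusCode());
    }
}
```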
Upvotes: 1
Views: 2629
Reputation: 360
I think a better solution would be to version the zip packages you upload to Blob Storage. Each deployment would then point WEBSITE_RUN_FROM_PACKAGE at a new package URL; updating that app setting forces a restart of the app service, and the restart makes the function app fetch the new package.
You could use the build number from the CI/CD pipeline to name your zip packages since every pipeline run would get a unique build number.
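To make that concrete, here is a hedged sketch of what such a pipeline step could look like in Java; the container name, environment variables and local artifact path are assumptions:

```java
// Hedged sketch of the pipeline step suggested here: upload the package as
// myFunction-<buildNumber>.zip and print a read-only SAS URL that Terraform
// would then set as WEBSITE_RUN_FROM_PACKAGE. The container name, environment
// variables and local artifact path are assumptions, not the actual pipeline.
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;

import java.time.OffsetDateTime;

public class PublishPackage {
    public static void main(String[] args) {
        String buildNumber = System.getenv("BUILD_BUILDNUMBER"); // e.g. Azure DevOps build number
        String blobName = "myFunction-" + buildNumber + ".zip";

        BlobClient blob = new BlobClientBuilder()
                .connectionString(System.getenv("PACKAGE_STORAGE_CONNECTION_STRING"))
                .containerName("packages")
                .blobName(blobName)
                .buildClient();

        // Upload the build artifact, overwriting in case the build is re-run.
        blob.uploadFromFile("target/myFunction.zip", true);

        // Long-lived, read-only SAS so each customer's function app can pull the package.
        String sas = blob.generateSas(new BlobServiceSasSignatureValues(
                OffsetDateTime.now().plusYears(1),
                new BlobSasPermission().setReadPermission(true)));

        // This is the URL Terraform would put into WEBSITE_RUN_FROM_PACKAGE.
        System.out.println(blob.getBlobUrl() + "?" + sas);
    }
}
```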
Upvotes: 0
Reputation: 6647
Option two is a good option to consider. Like you've mentioned, it would have sufficient permissions to do its job without external services needing access to the environment.
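Assuming option two means a timer-triggered function running inside each customer environment, a minimal pull-based sketch could look like this (the `PACKAGE_SAS_URL` setting and the `SyncTriggers` helper are hypothetical names):

```java
// Hedged sketch, assuming "option two" is a timer-triggered function running in
// each customer environment. It compares the package blob's ETag with the last
// one it saw and re-syncs when the ETag changes. The PACKAGE_SAS_URL app setting
// and the SyncTriggers helper (wrapping the ARM call from the question) are
// hypothetical names, and real state should be persisted rather than kept static.
import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.TimerTrigger;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PackageWatcher {

    // In-memory only for the sketch; a real implementation would persist this.
    private static String lastSeenEtag = null;

    @FunctionName("packageWatcher")
    public void run(
            @TimerTrigger(name = "timer", schedule = "0 */15 * * * *") String timerInfo,
            ExecutionContext context) throws Exception {

        // SAS URL of myFunction-latest.zip, injected as an app setting (assumed name).
        String packageUrl = System.getenv("PACKAGE_SAS_URL");

        // HEAD request: we only need the ETag header, not the zip itself.
        HttpResponse<Void> head = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(packageUrl))
                        .method("HEAD", HttpRequest.BodyPublishers.noBody())
                        .build(),
                HttpResponse.BodyHandlers.discarding());

        String etag = head.headers().firstValue("ETag").orElse("");
        if (!etag.equals(lastSeenEtag)) {
            context.getLogger().info("Package changed, syncing triggers (ETag " + etag + ")");
            SyncTriggers.syncAllFunctionApps(); // hypothetical helper around syncfunctiontriggers
            lastSeenEtag = etag;
        }
    }
}
```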
You could enhance it to avoid polling (especially if updates are infrequent or you want near-instant updates) by deploying a logic app in each customer environment that implements the webhook trigger pattern.
As per the above doc, the Custom API (which could be a durable function) would run in your environment, and all customer logic apps would subscribe to it.
When a newer version of your function app package is uploaded, your CI/CD pipeline would trigger your Custom API, which in turn would trigger all the subscribed Logic Apps.
These Logic Apps would just call Sync Triggers on their respective function apps.
You could implement something similar for option three using durable functions and their external events feature, with security in place.
The below is currently in Public Preview.
Another interesting way to achieve this is using Azure Event Grid Partner Topics. You would set up the Event Grid Partner side of things in your subscription and have a site/form where your customers go to register for events, thereby creating a partner topic in their subscription.
Then a logic app or function could listen to this topic for events.
Upvotes: 1