Reputation: 3583
I have an Azure Function v3, written in C#, using the class library approach.
The problem is that cold starts can be as long as 30 minutes! I have consulted the documentation from this link
But there are no specific figures regarding expected cold start timings.
An interesting observation: if I navigate to the portal and hit the Refresh button, the function gets triggered immediately.
Upvotes: 3
Views: 23099
Reputation: 11
To mitigate cold start issues, you can use a keep-alive mechanism for Azure Functions.
There are three mechanisms that you may consider:
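As a minimal sketch of one such keep-alive mechanism (the function name and schedule below are mine, not from this answer), a timer-triggered function in the same function app fires every few minutes so the host instance never goes idle:

    using System;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class KeepAlive
    {
        // NCRONTAB schedule: every 5 minutes. The name and schedule are illustrative.
        // Any regularly firing trigger keeps the Consumption-plan instance warm,
        // at the cost of the (small) executions it consumes.
        [FunctionName("KeepAlive")]
        public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer, ILogger log)
        {
            log.LogInformation($"Keep-alive ping at {DateTime.UtcNow:O}");
        }
    }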
Upvotes: 0
Reputation: 708
That is not correct; you're referring to a timeout duration, which is not equivalent to a cold start. If your function takes that long, it's because there was an exception/error (I don't mean an exception in code). Nowhere in any documentation (even by Microsoft) does it say that cold starts take 30 mins.
If cold starts are taking too long (>~1 min) and you're sure there aren't any startup errors, break your functions down and trigger those functions in parallel (where possible). Application Insights is your friend here.
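As a rough illustration of the "break it down and run in parallel" idea (the endpoint URLs below are placeholders, not anything from the original post), a caller can fan out to several smaller HTTP-triggered functions and await them together:

    using System.Net.Http;
    using System.Threading.Tasks;

    public static class FanOutCaller
    {
        // Reuse a single HttpClient instance across calls.
        private static readonly HttpClient Client = new HttpClient();

        // Instead of one heavyweight function doing everything during a cold start,
        // split the work into smaller HTTP-triggered functions and call them
        // concurrently. The URLs below are placeholders for your own endpoints.
        public static async Task RunAllAsync()
        {
            var calls = new[]
            {
                Client.GetAsync("https://myapp.azurewebsites.net/api/step-one"),
                Client.GetAsync("https://myapp.azurewebsites.net/api/step-two"),
                Client.GetAsync("https://myapp.azurewebsites.net/api/step-three")
            };

            await Task.WhenAll(calls);
        }
    }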
Lastly, there are no exact figures - it's very much a case-by-case thing. There are many things that affect execution time, which might lead one to blame cold starts rather than the responsibilities & dependencies of a function. With Functions/Lambda, it's not just "code like any other API"; there are also some architectural changes.
What I would definitely like to know is the performance difference (especially cold start) if the code is published to Azure Functions as code vs. in a Docker container (https://learn.microsoft.com/en-us/azure/devops/pipelines/targets/function-app-container?view=azure-devops&tabs=yaml). I'll try to find some time to perform a good experiment on this and report back (no promises).
Further Readings -
EDIT: Though this is a slightly old GitHub issue thread, the maximum cold start time reported on it is about 2 minutes; the average seems to be somewhere around 20 seconds. Note that the Engineering Manager in Microsoft Azure at the time (David Ebbo) mentions that it shouldn't take longer than 10 seconds for trivial functions. https://github.com/Azure/azure-functions-host/issues/838
EDIT 2: The 30 minute timeout duration is for non-Consumption plans (5 minutes for the Consumption plan), so I assume you're planning to use a non-Consumption plan, in which case you're not going to have cold start issues - the other plans keep the functions warm. Please go through "Overview of plans" in the document you linked - https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale#overview-of-plans
Upvotes: 7
Reputation: 222720
In general, cold starts occur when your function hasn't run in ~20 minutes.
To avoid this, run your code frequently enough that it stays warm and cold starts won't occur (unless you scale out).
A common way of doing this is to make a simple "invoker" helper function which calls your function every 5-10 minutes.
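A minimal sketch of such an invoker, assuming a timer-triggered function that pings the target function's HTTP endpoint (the URL and schedule are placeholders, not from the original answer):

    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class Invoker
    {
        private static readonly HttpClient Client = new HttpClient();

        // Fires every 10 minutes and pings the target function's HTTP endpoint
        // so its instance stays warm. The URL is a placeholder for your own
        // HTTP-triggered function.
        [FunctionName("Invoker")]
        public static async Task Run([TimerTrigger("0 */10 * * * *")] TimerInfo timer, ILogger log)
        {
            var response = await Client.GetAsync("https://myapp.azurewebsites.net/api/my-function");
            log.LogInformation($"Warm-up ping returned HTTP {(int)response.StatusCode}");
        }
    }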
If that doesn't fit your solution and you're OK with paying slightly more, you can run your Function in the Dedicated plan.
Regarding SLA on Consumption Plan:
“Unavailable Executions” is the total number of executions within Total Triggered Executions which failed to run. An execution failed to run when the given Function App history log did not capture any output five (5) minutes after the trigger is successfully fired.
Upvotes: 1
Reputation: 207
The Consumption plan is what Azure calls the "serverless" model. What it means:
your code reacts to events, effectively scales out to meet whatever load you're seeing, scales down when code isn't running, and you're billed only for what you use.
Cold start is nothing but the phenomenon that applications which haven't been used take longer to start up.
When you're using the Consumption plan, instances of the Azure Functions host are dynamically added and removed based on the number of incoming events.
If you have "heavy-weight" code written and deployed on the Consumption plan which takes a lot of memory and resources to load and execute, it may take more time, as in your case. When you hit the Refresh button in the portal, the Functions runtime resets, any required extensions are loaded onto the worker, and your code gets loaded into memory. That's why it removes most of the latency.
To understand more about cold starts in the Azure serverless model and how you can minimize them, please refer here - Understanding serverless cold start
To reduce it further, you can have a warm-up request that fires on a time interval, so your function stays loaded in memory.
Upvotes: 4