Reputation: 186
I've recently started working with Azure Functions and (after reading SO and Microsoft docs) have been having trouble understanding scale out and parallel execution.
My situation is a function app with CRUD Azure Functions - they need to respond quickly and concurrently, like a REST API. However, when testing in my own browser with 10 different tabs open, the tabs seem to finish sequentially (one after the other, with the last tab waiting a LONG time).
I was wondering if I am missing something, or if there is a way to allow for parallel execution using some other Azure product?
(I've read into a few application settings and possibly using APIM or hosting the functions, but these didn't seem to be the answer.)
Thanks!
Upvotes: 0
Views: 6965
Reputation: 472
I'm not sure of all your use cases, but I thought I would mention that for any use cases that do not need to be synchronous, you can create an HTTP trigger function whose only job is to put the request into a queue. Then you can use a queue trigger function to process it.
However, be mindful of queue trigger scale-out. I had a queue trigger that scaled out and ate up all the DB connections. Since this was part of an ETL job, I decorated the queue function with
[Singleton(Mode = SingletonMode.Listener)]
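A minimal sketch of that hand-off, using the in-process C# model and an Azure Storage queue; the queue name ("orders"), route, and processing logic are placeholders of mine, not anything from the question:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class OrderFunctions
{
    // HTTP trigger: all it does is drop the raw request body onto a storage
    // queue and return 202 immediately, so the caller never waits on the real work.
    [FunctionName("EnqueueOrder")]
    public static async Task<IActionResult> EnqueueOrder(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "orders")] HttpRequest req,
        [Queue("orders")] IAsyncCollector<string> queue,
        ILogger log)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        await queue.AddAsync(body);
        log.LogInformation("Request queued for background processing.");
        return new AcceptedResult();
    }

    // Queue trigger: does the actual (DB-heavy) processing. The Singleton
    // listener attribute keeps it to a single listener, so a scale-out
    // doesn't eat every DB connection, as described above.
    [FunctionName("ProcessOrder")]
    [Singleton(Mode = SingletonMode.Listener)]
    public static void ProcessOrder(
        [QueueTrigger("orders")] string message,
        ILogger log)
    {
        log.LogInformation("Processing queued message: {message}", message);
        // ... database / ETL work goes here ...
    }
}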
Also, with queues you need to set up alerts on queue depth and poison messages.
Qapla'! (Klingon for good luck!)
Upvotes: 0
Reputation: 17544
I think the cold start problem has already been mentioned.
Other than paying more (App Service plan or Premium plan), one other option is to write a little more code to save a bunch of money.
The idea is to add a timer-triggered function that fires every X seconds and calls /endpoint?keepWarm=1 against the REST API endpoint you want to keep warm. On the endpoint side, the implementation is simply to return 200 right away if it's a keepWarm call. Host this whole thing in the Consumption plan.
Even for X = 1 second, you'll probably end up paying a LOT less ($5-20) than the more expensive plans ($100+ I think).
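A rough sketch of what that could look like (in-process C#; the schedule, URL, and the "keepWarm" parameter name are my assumptions, not part of the original answer):

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class KeepWarmFunctions
{
    private static readonly HttpClient Http = new HttpClient();

    // Timer trigger: fires every 60 seconds (the "X" above) and pings the endpoint.
    [FunctionName("KeepWarmPing")]
    public static async Task Ping(
        [TimerTrigger("0 * * * * *")] TimerInfo timer,
        ILogger log)
    {
        var response = await Http.GetAsync(
            "https://<your-app>.azurewebsites.net/api/endpoint?keepWarm=1");
        log.LogInformation("Keep-warm ping returned {status}", response.StatusCode);
    }

    // The endpoint being kept warm: short-circuit keep-warm calls with a 200.
    [FunctionName("MyEndpoint")]
    public static IActionResult MyEndpoint(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "endpoint")] HttpRequest req,
        ILogger log)
    {
        if (req.Query["keepWarm"] == "1")
        {
            return new OkResult();          // keep-warm call: no real work
        }

        // ... real CRUD work goes here ...
        return new OkObjectResult("real response");
    }
}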
IMHO Premium and Dedicated plans are for when you need more fire-power, not for when you just want to keep things warm. In fact, the Consumption plan lets you scale out to 200 instances, whereas the limit for the other, pricier plans is 10 to 100.
With pricey plans you do get more fire-power so you can do bigger tasks and take as long as you like:
($) Unbounded execution time limit: if your trigger is an HTTP trigger (REST API), this is useless due to a load balancer limitation:
If a function that uses the HTTP trigger doesn't complete within 230 seconds, the Azure Load Balancer will time out and return an HTTP 502 error. The function will continue running but will be unable to return an HTTP response.
This is very unclear. As described here:
In the Consumption and Premium plans, Azure Functions scales CPU and memory resources by adding additional instances of the Functions host. The number of instances is determined on the number of events that trigger a function.
Each instance of the Functions host in the Consumption plan is limited to 1.5 GB of memory and one CPU. An instance of the host is the entire function app, meaning all functions within a function app share resources within an instance and scale at the same time.
Also, it is possible to set FUNCTIONS_WORKER_PROCESS_COUNT via Application Settings to control the number of language worker processes. I guess in this case each language worker process would run within the same host and share its resources.
Upvotes: 3
Reputation: 1243
I think you have a couple different issues you'll need to address:
The first is that your browser likely has a concurrent connection limitation to a single domain. Most modern browsers are going to limit this to 6. This limitation is not on a per-tab basis, but applies to all open tabs within the browser. So in your case, you have 10 open tabs, and best case scenario, 4 of those will be waiting for the other 6 to complete. You may want to look at something like Fiddler, or a tool specific to load testing to get around this limitation.
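If it helps, here's a minimal, hypothetical console client (plain C#/.NET; the URL and request count are placeholders) that fires the 10 requests truly in parallel, which is roughly what a load-testing tool would do for you:

using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class ParallelRequestTest
{
    static async Task Main()
    {
        // Raise the per-server connection cap so all 10 requests really run at once.
        var http = new HttpClient(new SocketsHttpHandler { MaxConnectionsPerServer = 20 });
        var url = "https://<your-app>.azurewebsites.net/api/endpoint";

        var sw = Stopwatch.StartNew();

        // Start all 10 requests before awaiting any of them.
        var tasks = Enumerable.Range(0, 10).Select(_ => http.GetAsync(url)).ToArray();
        var responses = await Task.WhenAll(tasks);

        sw.Stop();
        Console.WriteLine($"{responses.Length} responses in {sw.Elapsed.TotalSeconds:F1}s: " +
                          string.Join(", ", responses.Select(r => (int)r.StatusCode)));
    }
}

If the total time is still roughly 10x a single call even with a client like this, the bottleneck is on the Azure side (cold start or scale-out limits) rather than in the browser.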
The next issue you're likely to run into is serverless cold start. This is when serverless code that has not been executed in a while gets unloaded; when it's called again later, there's a spin-up delay while the function is prepared for execution. Here's a good image from this Microsoft post
That same post gives a couple of ideas for mitigating this. One is running your Azure Functions within an App Service plan, which you can set to always be running. This, however, means your Azure Functions are no longer serverless.
The other option is to use the pre-warmed instances available in the Premium plan of Azure Functions.
Lastly, you may need to expand the number of instances your plan can scale out to. If your plan is only set to scale out to 1 instance, then every single one of your calls is going to wait for the previous to complete. If you scale those out to a burst of 10, then they can all run concurrently.
Upvotes: 4