Yes, that's correct. The Consumption plan has a limit of 100 instances per Linux Function App, as you mentioned. If you upload 100 files to the container, the Consumption plan scales out to create up to 100 concurrent instances of your Function to handle the 100 events. If you upload more files than can be processed concurrently, the remaining triggers are queued and processed as instances become available. So, if the Consumption plan cannot scale out further, some of the events may not be processed immediately, but they are not lost.
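For reference, the kind of trigger under discussion looks roughly like this in the Python v2 programming model. This is a minimal sketch: the uploads container name and the AzureWebJobsStorage connection setting are assumptions, not values from your app.

    import logging

    import azure.functions as func

    app = func.FunctionApp()

    @app.blob_trigger(arg_name="blob",
                      path="uploads/{name}",
                      connection="AzureWebJobsStorage")
    def process_upload(blob: func.InputStream):
        # One event fires per uploaded blob; the platform fans these
        # events out across Consumption-plan instances, up to the
        # plan's scale-out limit.
        logging.info("Processing %s (%s bytes)", blob.name, blob.length)

Each of the 100 uploads produces one such event, and the platform decides how many instances to run them on.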
About Azure Functions: quotas & limits
Hi,
According to the docs:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale#service-limits
Azure Functions instances have up to 1.5 GB of RAM.
I've been using a function deployed on Azure that reads a 100 MB pandas DataFrame and partitions it into almost 40 smaller DataFrames. The function sometimes terminates abruptly and other times does its job, even when it takes a long time.
I've tested locally, reading the same DataFrame and performing the same task, and my laptop's memory usage climbs to 21-25 GB while splitting with this function.
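For a sense of scale, here is a minimal local sketch of what the split alone should cost (the DataFrame below is a hypothetical stand-in, since my real schema is different):

    import numpy as np
    import pandas as pd

    # Hypothetical stand-in for the real ~100 MB DataFrame:
    # 1.6M rows x 8 float64 columns is roughly 100 MB of data.
    df = pd.DataFrame(np.random.rand(1_600_000, 8))
    print(f"source: {df.memory_usage(deep=True).sum() / 1e6:.0f} MB")

    # Split into ~40 independent row chunks and measure them together.
    size = -(-len(df) // 40)  # ceiling division
    parts = [df.iloc[i:i + size].copy() for i in range(0, len(df), size)]
    total_mb = sum(p.memory_usage(deep=True).sum() for p in parts) / 1e6
    print(f"{len(parts)} parts together: {total_mb:.0f} MB")

A plain row split like this stays near 100 MB in total, which suggests the blow-up I see comes from whatever else happens during the splitting, not from holding the chunks themselves.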
Now I can understand why the function terminates abruptly; let's say it works about half the time.
Why doesn't the Function finish the whole execution, given that I now know the operations on that DataFrame go beyond the 1.5 GB RAM of the Consumption plan? Conversely, I can't understand why the Function sometimes executes fine (even when it takes very long) given that I surpass its RAM limit.
How can I monitor the memory usage of a Function? Is that possible?
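For example, is there a way to log something like this from inside the function itself? A rough sketch of what I mean, assuming a Linux worker (the resource module is POSIX-only):

    import logging
    import resource

    def log_peak_rss(tag: str) -> None:
        # ru_maxrss is the process's peak resident set size;
        # on Linux it is reported in kilobytes.
        peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        logging.info("%s: peak RSS so far = %.0f MB", tag, peak_kb / 1024)

    # e.g. call log_peak_rss("before split") and log_peak_rss("after split")
    # inside the function body, so the values show up in the function's logs.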
Can somebody give me a hand understanding this?
Regards