How to increase parallelism in Azure Functions with app service plan?

Tejashwini H 5 Reputation points
2023-03-27T06:14:12.98+00:00

Hello,
I have a blob-triggered Azure Functions Python project running on an App Service Plan, with 3 functions under a single function app. The function app uses a basic configuration, and the execution time of each function is very slow. I tried to introduce Dynamic Concurrency and could see only

{"Timestamp":"2023-03-14T16:40:43.9369397Z","NumberOfCores":2,"FunctionSnapshots":{"SharedBlobQueueListener":{"Concurrency":1}}}

in the logs, with no performance improvement. (Can you please explain what "SharedBlobQueueListener" is?) I even tried


"FUNCTIONS_WORKER_PROCESS_COUNT": 4,
"PYTHON_THREADPOOL_THREAD_COUNT": 8,

but it was of minimal help.
So, can you please guide me on bringing in parallelism at the level of the function, function app, worker, and host, as well as other ways to improve performance on an App Service Plan?

At the same time, can you kindly explain briefly:

  1. Dynamic Concurrency (and MaxDegreeOfParallelism),
  2. Dynamic Concurrency vs PYTHON_THREADPOOL_THREAD_COUNT, and whether they work together,
  3. FUNCTIONS_WORKER_PROCESS_COUNT vs WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT,
  4. Dynamic Concurrency with respect to App Service Plan.

Thank you.


1 answer

  1. Andriy Bilous 11,421 Reputation points MVP
    2023-03-27T14:27:24.1566667+00:00

    Hello @Tejashwini H

    Sure, let me explain:

    SharedBlobQueueListener

    • This log entry is a Dynamic Concurrency snapshot rather than a setting you configure directly. SharedBlobQueueListener is the internal name of the listener that processes the queue backing your blob triggers; it is shared by all blob-triggered functions in the app. In this case, it shows that the listener is running with a concurrency of 1 (one invocation at a time).

    Dynamic Concurrency

    • A feature that automatically adjusts the number of concurrent function invocations based on the performance of the Function App. It aims to optimize throughput and resource usage.
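
    A minimal host.json sketch for turning this on (both settings belong to the host.json `concurrency` section; dynamic concurrency is off by default):

```json
{
  "version": "2.0",
  "concurrency": {
    "dynamicConcurrencyEnabled": true,
    "snapshotPersistenceEnabled": true
  }
}
```

    With `snapshotPersistenceEnabled`, learned concurrency values are persisted so the host does not have to relearn them after a restart.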

    MaxDegreeOfParallelism

    • The maximum number of concurrent invocations of a function. This setting is for Durable Functions, which run on the Durable Task Framework. It's not directly related to the Dynamic Concurrency feature.

    Dynamic Concurrency vs PYTHON_THREADPOOL_THREAD_COUNT

    • Dynamic Concurrency manages the number of concurrent function invocations, while PYTHON_THREADPOOL_THREAD_COUNT sets the number of threads used by the Python worker's ThreadPoolExecutor. In general, PYTHON_THREADPOOL_THREAD_COUNT can be used to increase parallelism within a single worker process, while Dynamic Concurrency adjusts the parallelism of function invocations at the trigger level. They operate at different levels, so they can be used together.
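
    As a rough stand-alone illustration (plain Python, outside the Functions runtime): a thread pool like the one the worker uses only helps when the work is I/O-bound, because threads that are waiting can overlap. Here `max_workers` plays the role of PYTHON_THREADPOOL_THREAD_COUNT:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def io_task(n):
    time.sleep(0.1)  # stand-in for I/O, e.g. reading a blob
    return n * 2

start = time.perf_counter()
# max_workers plays the role of PYTHON_THREADPOOL_THREAD_COUNT
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(io_task, range(8)))
elapsed = time.perf_counter() - start

# The eight 0.1 s waits overlap, so total time is close to 0.1 s, not 0.8 s
print(results, round(elapsed, 2))
```

    For CPU-bound work the threads would serialize on the GIL, which is why raising this value did not help much in your case.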

    FUNCTIONS_WORKER_PROCESS_COUNT vs WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT

    • FUNCTIONS_WORKER_PROCESS_COUNT sets the number of worker processes per instance, while WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT caps the number of instances the platform scales out to when using a Consumption or Premium plan.
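
    Both are ordinary application settings. For local development they go in local.settings.json; in Azure they go under the Function App's Configuration > Application settings. A sketch with illustrative values (WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT is an Azure-side setting and has no local equivalent):

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "FUNCTIONS_WORKER_PROCESS_COUNT": "4",
    "PYTHON_THREADPOOL_THREAD_COUNT": "8"
  }
}
```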

    Dynamic Concurrency with respect to App Service Plan

    • Dynamic Concurrency can be used with both Consumption and Premium plans but is not available for the App Service Plan. On an App Service Plan, you need to configure the scaling options manually (scale out and scale up).

    To improve the performance of an App Service Plan, use the following options:

    • Scale out: Increase the number of instances (VMs) your Function App runs on. This can be done by going to your Function App in the Azure portal, then under "Settings," select "Scale out (App Service Plan)." Increase the instance count manually, or enable autoscaling based on rules.
    • Scale up: Upgrade to a higher-tier App Service Plan to get more resources per instance. Go to your App Service Plan in the Azure portal, then under "Settings," select "Scale up (App Service Plan)." Choose a higher-tier plan with more CPU cores and memory.

    The FUNCTIONS_WORKER_PROCESS_COUNT and PYTHON_THREADPOOL_THREAD_COUNT settings configure concurrency and parallelism for Azure Functions when using Python.

    • PYTHON_THREADPOOL_THREAD_COUNT: This setting controls the number of threads used by the Python ThreadPoolExecutor within a single function instance. ThreadPoolExecutor is a class in the concurrent.futures module that can be used to execute multiple tasks concurrently using a pool of worker threads. Increasing the number of threads can improve the parallelism of tasks within a single function instance, but it's important to ensure that the tasks being executed are thread-safe and not CPU-bound. If the tasks are CPU-bound, adding more threads may not improve the performance and could even lead to worse performance due to thread switching overhead.
    • FUNCTIONS_WORKER_PROCESS_COUNT: This setting controls the number of worker processes that the Functions host spawns per instance. Each worker process runs your functions separately and in parallel. By default, this value is set to 1, meaning only one worker process is running per instance. Increasing the number of worker processes can improve the overall throughput and performance of your Function App. However, be cautious not to set this value too high, as it may cause resource contention and decrease performance.
    2 people found this answer helpful.
