Is there any way to make Azure Data Factory pipelines wait in queue if infrastructure reaches maximum capacity and not fail?

Danish Ahmed Mehmuda 0 Reputation points
2024-06-26T14:57:09.1266667+00:00

I have an event-based trigger in Azure Data Factory that executes ETL pipelines every 5 minutes. The pipeline has some Databricks notebook activities that run on a cluster pool. In some cases, the pipelines fail with this error: "INSTANCE_POOL_MAX_CAPACITY_REACHED (CLIENT_ERROR)". I don't want to increase the number of instances in the cluster pool, so I was wondering: is there any functionality so that, when there are no more instances to allocate to the cluster, the pipeline stays in a Queued or In-Progress state instead of failing?

Azure Data Factory

2 answers

Sort by: Most helpful
  1. josh morrish 0 Reputation points
    2024-06-27T03:52:06.9233333+00:00

    Hi Danish,

Could you try staggering your pipeline runs? If your pipelines are scheduled to run at the same time, consider staggering them so that they don't all request resources from the cluster pool simultaneously.

    Alternatively, implement a retry mechanism: you can configure a retry policy on the activities in your pipeline to handle the "INSTANCE_POOL_MAX_CAPACITY_REACHED" error. When the error occurs, the activity waits for a set period of time and then retries the operation.
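    As a sketch, an activity's retry behavior is set in its `policy` block in the pipeline JSON. The activity name, notebook path, and linked service name below are placeholders; `retry` and `retryIntervalInSeconds` are the relevant settings:

    ```json
    {
        "name": "RunEtlNotebook",
        "type": "DatabricksNotebook",
        "policy": {
            "timeout": "0.01:00:00",
            "retry": 3,
            "retryIntervalInSeconds": 300
        },
        "typeProperties": {
            "notebookPath": "/ETL/my_notebook"
        },
        "linkedServiceName": {
            "referenceName": "AzureDatabricksLinkedService",
            "type": "LinkedServiceReference"
        }
    }
    ```

    With this policy, a run that hits the capacity error would be retried up to 3 times at 5-minute intervals rather than failing on the first attempt.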


  2. Smaran Thoomu 16,555 Reputation points Microsoft Vendor
    2024-06-30T15:20:35.48+00:00

    Hi @Danish Ahmed Mehmuda

    Thanks for the question and using MS Q&A platform.

    Adding to the above, there is a way to make Azure Data Factory pipelines wait in queue if the infrastructure reaches maximum capacity and not fail. You can achieve this by configuring the concurrency settings for your pipeline.

    Concurrency settings allow you to control the maximum number of pipeline runs that can be executed simultaneously. When the maximum concurrency is reached, any additional pipeline runs are queued and executed when resources become available.

    Here's how you can configure concurrency settings for your pipeline:

    1. Open your pipeline in the Azure Data Factory UI.
    2. Click on the "Settings" tab.
    3. Under "Concurrency control", select "Limited concurrency".
    4. Set the maximum concurrency to the desired value. This value should be less than or equal to the maximum number of instances in your cluster pool.
    5. Click "Save" to save the changes.

    With these settings, when the maximum concurrency is reached, any additional pipeline runs will be queued and executed when resources become available. This will prevent your pipelines from failing due to the "INSTANCE_POOL_MAX_CAPACITY_REACHED" error.
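    The same limit can also be set in the pipeline's JSON definition via the `concurrency` property. The pipeline name and the value 2 below are illustrative; the value should match your cluster pool's capacity, and the activities array is omitted for brevity:

    ```json
    {
        "name": "EtlPipeline",
        "properties": {
            "concurrency": 2,
            "activities": [
            ]
        }
    }
    ```

    Runs beyond the concurrency limit show as Queued in the monitoring view and start automatically once an earlier run completes.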

    Hope this helps. Do let us know if you have any further queries.


