Automatically triggered pipeline run. There are not enough resources available in the workspace

Rita Wang 0 Reputation points
2023-09-21T15:03:11.5533333+00:00

I have been running several pipelines on a trigger in Azure Synapse Analytics. Each pipeline has a set of tables it needs to pass from one level to the next, and it loops through that set of tables with a ForEach activity.

I have 6 pipelines that effectively run in parallel, each with about 8-10 tables (ForEach iterations).

Let's say pipeline A has one iteration that failed (one ForEach iteration, meaning one table was not copied to the next level) because of the following error:

{
    "errorCode": "3250",
    "message": "There are not enough resources available in the workspace, details: 'Your job requested 8 vcores. However, the workspace only has 2 vcores available out of quota of 50 vcores for node size family [MemoryOptimized]. Try ending the running job(s), reducing the numbers of vcores requested or increasing your vcore quota. https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-concepts#workspace-level'",
    "failureType": "UserError",
    "target": "Dataflow_D365_FO_DELETE_SILVER",
    "details": []
}

However, these pipelines are triggered at 2-hour intervals, and the activity that fails differs every time. For example, pipeline A will have 2 activities fail on this run, and next time it will be a different pipeline A activity that fails. This makes the problem hard to isolate.

Or the next time it will be a different pipeline's activity that fails with the resource error.

Lastly, I have already increased the compute size for pipeline A's data flow, but the problem persists. Could someone explain how to resolve this, either by increasing the workspace vcores or by decreasing the number of vcores each job requests? Anything that helps me resolve this issue would be great.

Azure Synapse Analytics

2 answers

  1. Amira Bedhiafi 23,016 Reputation points
    2023-09-21T15:30:01.6566667+00:00

    I guess your problem is related to resource contention, specifically virtual CPU cores (vcores).

    Some pipeline activities let you set how many vcores they will use. Adjust these settings to better align with your available resources (see the sketch below).
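
    As a minimal sketch (assuming the failing activity is the Execute Data Flow activity named in your error message; the data flow reference name is a placeholder, not taken from your workspace), the vcore request is set in the activity's compute block in the pipeline JSON:

    {
        "name": "Dataflow_D365_FO_DELETE_SILVER",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataflow": {
                "referenceName": "<your data flow name>",
                "type": "DataFlowReference"
            },
            "compute": {
                "coreCount": 8,
                "computeType": "MemoryOptimized"
            }
        }
    }

    coreCount and computeType are the two knobs on the activity side: 8 is typically the smallest data flow compute size, and changing computeType (for example to General) may move the request onto a different node size family's quota, since your error refers to the MemoryOptimized family.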

    If that doesn't work, you might need to request a vcore quota increase from Azure support, or try switching to a different node size family if your current one is too resource-constrained.


  2. Bhargava-MSFT 30,816 Reputation points Microsoft Employee
    2023-09-26T20:12:15.7166667+00:00

    Thank you, Amira Bedhiafi

    Hello Rita Wang,

    In addition to Amira's answer:

    When you run a data flow inside a ForEach activity, how many vcores you need at once depends on whether the ForEach runs in parallel or sequentially.

    Example, parallel run: if there are 32 iterations in your ForEach and each iteration requests a single vcore, then that pipeline run needs 32 vcores at the same time.

    Example, sequential run: if there are 32 iterations in your ForEach and each iteration requests a single vcore, the iterations run one after another and the same vcore is reused by each iteration, so the pipeline does not need 32 vcores at the same time. In a sequential run the compute resources (vcores) are reused.

    In your case, only 2 vcores are available in the workspace, which is not sufficient to satisfy the 8 vcores your data flow requested, so the activity fails.
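
    As a hedged sketch (the activity and parameter names here are placeholders and the inner activity body is trimmed, not your exact pipeline definition), ForEach concurrency is controlled by the isSequential and batchCount properties in the pipeline JSON:

    {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
            "isSequential": false,
            "batchCount": 2,
            "items": {
                "value": "@pipeline().parameters.tableList",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "Dataflow_D365_FO_DELETE_SILVER",
                    "type": "ExecuteDataFlow"
                }
            ]
        }
    }

    Setting isSequential to true (batchCount is then ignored) makes the iterations reuse the same vcores as described above; leaving it false but lowering batchCount caps how many data flows request vcores at the same time. Either way, the trade-off is a longer pipeline run in exchange for a smaller simultaneous vcore footprint.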

    Reference document: https://learn.microsoft.com/en-us/azure/data-factory/concepts-data-flow-performance-pipelines

    You can request a capacity increase via the Azure portal by creating a new support ticket.

    If you would like to increase the capacity, please follow this thread.

    A similar question was discussed here.

    I hope this helps. If you have any further questions, please let us know.

