Increasing your vcore quota

Rickus du Plooy 1 Reputation point
2023-06-15T12:18:16.6+00:00

I am getting the following error when trying to run hello-world code on a newly created Spark pool in my newly created Synapse workspace:

InvalidHttpRequestToLivy: Your Spark job requested 12 vcores. However, the workspace has a 0 core limit. Try reducing the numbers of vcores requested or increasing your vcore quota. HTTP status code: 400.

How can this be resolved? All I have found so far is that I should raise it with support, but I get stuck where they want me to pay 30 USD to raise a ticket. Are there any other ways to get this resolved?

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. PRADEEPCHEEKATLA-MSFT 77,426 Reputation points Microsoft Employee
    2023-06-16T09:36:29.19+00:00

    @Rickus du Plooy - Thanks for the question and using MS Q&A platform.

    Every Azure Synapse workspace comes with a default quota of vCores that can be used for Spark. The quota is split between the user quota and the dataflow quota so that neither usage pattern uses up all of the vCores in the workspace. The quota differs depending on the type of your subscription but is symmetrical between user and dataflow. However, if you request more vCores than are remaining in the workspace, you will get the following error: InvalidHttpRequestToLivy: Your Spark job requested 12 vcores. However, the workspace has a 0 core limit. Try reducing the numbers of vcores requested or increasing your vcore quota. HTTP status code: 400.

    As per the error message, the vCores available to the Spark session are exhausted. When you define a Spark pool, you are effectively defining a quota per user for that pool.

    The vCore limit depends on the node size and the number of nodes (see the sketches after these options). To solve this problem, reduce your usage of the pool resources before submitting a new resource request, such as running a notebook or a job.

    (or)

    Please scale up the node size and the number of nodes.
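
    To make the relationship between node size, node count, and requested vCores concrete, here is a minimal sketch. The per-node vCore counts match the published Synapse node sizes (Small = 4, Medium = 8, Large = 16 vCores); the assumption that a session occupies one node for the driver plus one per executor, and the example of a driver plus two executors on Small nodes, are only illustrations of how a request can add up to the 12 vCores in the error.

    ```python
    # Minimal sketch: estimate the vCores a Spark session will request from a
    # Synapse pool, based on node size and the number of nodes it occupies.
    # Per-node vCore counts follow the published Synapse node sizes; the
    # session shapes below are illustrative, not read from any workspace.

    VCORES_PER_NODE = {"Small": 4, "Medium": 8, "Large": 16}


    def requested_vcores(node_size: str, executors: int) -> int:
        """vCores for one driver node plus `executors` executor nodes of `node_size`."""
        nodes = 1 + executors  # driver + executors (illustrative assumption)
        return nodes * VCORES_PER_NODE[node_size]


    if __name__ == "__main__":
        # A driver plus two executors on Small nodes requests 12 vCores,
        # matching the number in the error message above.
        print(requested_vcores("Small", executors=2))   # -> 12
        # The same session shape on Medium nodes would request 24 vCores,
        # so larger nodes need more headroom in the workspace quota.
        print(requested_vcores("Medium", executors=2))  # -> 24
    ```

    Since the error reports a remaining workspace limit of 0 vCores, no session shape will fit until either running workloads release vCores or the quota is raised as described below.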
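
    Before resubmitting, it can also help to see what is currently holding vCores on the pool. The following is a rough sketch rather than an official sample: it assumes the documented Livy-compatible data-plane endpoint (livyApi/versions/2019-11-01-preview) and the https://dev.azuresynapse.net token audience, and the workspace and pool names are placeholders.

    ```python
    # Rough sketch: list Spark sessions and batch jobs on a Synapse Spark pool
    # to see what is consuming vCores before submitting a new job.
    # Endpoint shape and API version are assumptions based on the public
    # Synapse data-plane API; the names below are placeholders.
    import requests
    from azure.identity import DefaultAzureCredential

    WORKSPACE = "<your-workspace-name>"    # placeholder
    SPARK_POOL = "<your-spark-pool-name>"  # placeholder

    # Token for the Synapse data plane.
    token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token
    headers = {"Authorization": f"Bearer {token}"}

    base = (
        f"https://{WORKSPACE}.dev.azuresynapse.net/livyApi/versions/"
        f"2019-11-01-preview/sparkPools/{SPARK_POOL}"
    )

    for kind in ("sessions", "batches"):
        resp = requests.get(f"{base}/{kind}", headers=headers)
        resp.raise_for_status()
        body = resp.json()
        # Livy-style list responses typically keep items under "sessions";
        # fall back to the collection name just in case.
        items = body.get("sessions") or body.get(kind) or []
        print(f"{kind}: {body.get('total', len(items))} total")
        for item in items:
            print(f"  id={item.get('id')} state={item.get('state')}")
    ```

    Idle notebook sessions keep their vCores until they time out or are stopped, so stopping anything you no longer need (from the Monitor hub or the notebook itself) frees capacity for the next request.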

    To resolve this issue, you need to request a capacity increase via the Azure portal by creating a new support ticket.

    Step 1: Create a new support ticket, selecting the issue type Service and subscription limits (quotas) and the quota type Azure Synapse Analytics.


    Step 2: In the Details tab, click Enter details, choose the quota type Apache Spark (vCore) per workspace, select the workspace, and enter the requested quota.


    Step 3: Select a support method and create the ticket.

    For more details, refer to Apache Spark in Azure Synapse Analytics Core Concepts.

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.

    3 people found this answer helpful.