Getting an error when running Spark pool

Abhishek Ramamurthy 70 Reputation points
2024-02-22T20:28:08.0866667+00:00

Every time I run the notebook I get this error:
InvalidHttpRequestToLivy: Your Spark job requested 12 vcores. However, the workspace has a 0 core limit. Try reducing the numbers of vcores requested or increasing your vcore quota. Quota can be increased using Azure Support request https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-concepts#workspace-level HTTP status code: 400. Trace ID: c1c645bd-fa19-4eeb-b8ae-08a885f5c693.

I've tried different node sizes, like Large (16 vCores), but I still get the same error. Can anyone please help me with it? Thanks.

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

Accepted answer
  1. PRADEEPCHEEKATLA 90,641 Reputation points Moderator
    2024-02-23T07:01:33.2566667+00:00

    @Abhishek Ramamurthy - Thanks for the question and using MS Q&A platform.

    Every Azure Synapse workspace comes with a default quota of vCores that can be used for Spark. The quota is split between the user quota and the dataflow quota so that neither usage pattern uses up all the vCores in the workspace. The quota is different depending on the type of your subscription, but is symmetrical between user and dataflow. However, if you request more vCores than are remaining in the workspace, then you will get the following error: InvalidHttpRequestToLivy: Your Spark job requested 12 vcores. However, the workspace has a 0 core limit. Try reducing the numbers of vcores requested or increasing your vcore quota. HTTP status code: 400.
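    For context, here is a rough sketch (not an official formula) of how the requested vCores relate to the pool definition. The per-node sizes used below are the published Synapse node sizes; the helper function itself is just for illustration.

    ```python
    # Rough sketch: estimate the vCores a Spark job will request so you can
    # compare it with the remaining workspace quota.
    # Published Synapse node sizes: Small=4, Medium=8, Large=16, XLarge=32, XXLarge=64 vCores.

    NODE_VCORES = {"Small": 4, "Medium": 8, "Large": 16, "XLarge": 32, "XXLarge": 64}

    def requested_vcores(node_size: str, node_count: int) -> int:
        """Approximate vCores requested = vCores per node x number of nodes."""
        return NODE_VCORES[node_size] * node_count

    print(requested_vcores("Small", 3))   # 3 Small nodes x 4 vCores = 12 (matches the error above)
    print(requested_vcores("Large", 3))   # 3 Large nodes x 16 vCores = 48

    # With a remaining workspace quota of 0 vCores, even the 12 vCore minimum
    # fails, which is why changing the node size alone does not help here.
    ```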

    As per the error message, the vCores available to the Spark session are exhausted. When you define a Spark pool, you are effectively defining a quota per user for that pool.

    The vCore limit depends on the node size and the number of nodes. To solve this problem, you have to reduce your usage of the pool resources before submitting a new resource request by running a notebook or a job, as in the sketch below.
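    As a minimal sketch of this option: if a notebook in the same pool is still holding a live session, stopping it releases its vCores back to the workspace. From inside a Synapse notebook you can stop the current session with the standard PySpark API (stopping it from the notebook toolbar or cancelling the application from the Monitor hub works just as well):

    ```python
    # Minimal sketch: release the vCores held by the current notebook session.
    # Synapse notebooks already expose the active SparkSession as `spark`;
    # stopping it ends the Livy session and returns its vCores to the workspace.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # in Synapse this is the provided `spark` session
    spark.stop()

    # Afterwards, start a fresh session (or run the next notebook) so the new
    # resource request is made against the freed-up quota.
    ```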

    (or)

    Please scale down the node size and the number of nodes.
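    If you prefer to script this instead of using Synapse Studio (Manage > Apache Spark pools > your pool), something along these lines with the azure-mgmt-synapse management SDK should work. Treat it as a sketch: the subscription, resource group, workspace, and pool names are placeholders, and you should verify the SDK call signatures against the current package version.

    ```python
    # Sketch only: scale a Synapse Spark pool down with the azure-mgmt-synapse SDK.
    # Assumes `pip install azure-identity azure-mgmt-synapse` and that the caller
    # has sufficient rights on the workspace. All names below are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.synapse import SynapseManagementClient

    subscription_id = "<subscription-id>"
    resource_group = "<resource-group>"
    workspace_name = "<workspace-name>"
    pool_name = "<spark-pool-name>"

    client = SynapseManagementClient(DefaultAzureCredential(), subscription_id)

    # Fetch the current pool definition, shrink it, and push the update back.
    pool = client.big_data_pools.get(resource_group, workspace_name, pool_name)
    pool.node_size = "Small"   # 4 vCores per node
    pool.node_count = 3        # smallest allowed node count
    # Note: if autoscale is enabled on the pool, adjust pool.auto_scale instead
    # of node_count.
    client.big_data_pools.begin_create_or_update(
        resource_group, workspace_name, pool_name, pool
    ).result()
    ```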

    Note: Azure free trial/Student/Pass subscriptions are not eligible for a quota request. You need to upgrade to a Pay-As-You-Go subscription to increase the quota, and a credit card is required to upgrade to a Pay-As-You-Go subscription.

    If you still have questions, you can chat with the sales team to get more information: https://azure.microsoft.com/en-in/free/students


    To resolve this issue, you need to request a capacity increase via the Azure portal by creating a new support ticket.

    Step 1: Create a new support ticket, select the issue type as Service and subscription limits (quotas), and the quota type as Azure Synapse Analytics.


    Step 2: In the Details tab, click Enter details, choose the quota type as Apache Spark (vCore) per workspace, select the workspace, and enter the requested quota.


    Step 3: Select the support method and create the ticket.

    For more details, refer to Apache Spark in Azure Synapse Analytics Core Concepts.

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.
