Hi Grzegorz Debowski,
Thank you for reaching out to us on the Microsoft Q&A forum.
You’ll encounter an error if you exceed the core limit for a region. To increase the core limit for the region, you must submit a support ticket.
Cause:
Azure applies quotas at various levels, such as resource groups, subscriptions, and accounts. For example, your subscription may have a regional core limit. If you try to deploy a virtual machine that exceeds the allowed cores, you'll get an error indicating the quota has been reached.
Solution:
Go to the Azure portal and file a support ticket. In the ticket, request a quota increase for the region where you need to create the VMs.
To check Usage + quotas for your subscription:
1. In the Azure portal, select your subscription.
2. Under Settings, select Usage + quotas.
3. Use the filter to select "Standard Series" and the region (for example, "East US").
4. Check the usage of Total Regional vCPUs.
5. If the quota is fully used, select Request Increase to raise the core limit for that region.
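You can also check the same regional vCPU usage from the Azure CLI (a sketch; it requires an authenticated `az` session, and `eastus` is just the example region from the steps above):

```shell
# List compute usage and quota limits for a region.
# Requires: az login, and the right subscription selected via `az account set`.
az vm list-usage --location eastus --output table

# Narrow the output to the overall regional vCPU counter
# (the same "Total Regional vCPUs" item shown in the portal).
az vm list-usage --location eastus \
  --query "[?localName=='Total Regional vCPUs'].{Name:localName, Current:currentValue, Limit:limit}" \
  --output table
```

If Current equals Limit, the quota is exhausted and a quota increase request is needed before new VMs can be deployed in that region.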
When you submit the ticket, specify the VM series and the region where you need the quota increase for your deployment.
For more details, refer to Resolve errors for resource quotas.
Note: Multi-node Azure Databricks clusters are not available under Azure Free Trial, Student, or Azure Pass subscriptions.
Reason: these subscriptions have a 4-core (vCPU) regional limit, while a multi-node Databricks cluster needs at least 8 cores (4 for the driver node plus 4 for each worker node).
To create multi-node Databricks clusters, you must upgrade to a Pay-As-You-Go subscription.
Note: Student subscriptions are not eligible for quota increases. To bypass these limits, upgrade to a Pay-As-You-Go subscription.
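The core arithmetic above can be sketched as follows (a minimal illustration using the 4-core node size and 4-core quota discussed above; actual vCPU counts vary by VM series):

```python
# Minimal sketch: why a multi-node Databricks cluster exceeds a 4-vCPU quota.

def required_vcpus(num_workers: int, vcpus_per_node: int = 4) -> int:
    """vCPUs needed: one driver node plus num_workers worker nodes."""
    return (1 + num_workers) * vcpus_per_node

QUOTA = 4  # regional vCPU limit on Free Trial / Student / Azure Pass subscriptions

single_node = required_vcpus(num_workers=0)  # driver only: 4 vCPUs, fits the quota
multi_node = required_vcpus(num_workers=1)   # driver + 1 worker: 8 vCPUs, exceeds it

print(single_node, single_node <= QUOTA)  # 4 True
print(multi_node, multi_node <= QUOTA)    # 8 False
```

This is why a Single Node cluster works on a Student subscription while even the smallest multi-node cluster does not.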
You can use an Azure Student subscription to create a Single Node cluster, which has a single driver node with 4 cores.
A Single Node cluster consists of only a Spark driver without any Spark workers. It supports Spark jobs and all data sources, including Delta Lake. In comparison, Standard clusters require at least one Spark worker to run jobs.
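As a sketch, if you create the cluster through the Databricks Clusters API or a JSON cluster definition rather than the UI, a Single Node cluster is defined by setting num_workers to 0 together with the single-node Spark profile (cluster_name, spark_version, and node_type_id below are example values; Standard_DS3_v2 is a 4-vCPU size):

```json
{
  "cluster_name": "single-node-demo",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 0,
  "spark_conf": {
    "spark.databricks.cluster.profile": "singleNode",
    "spark.master": "local[*]"
  },
  "custom_tags": {
    "ResourceClass": "SingleNode"
  }
}
```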
When to Use Single Node Clusters:
- Machine Learning Workloads: Ideal for running single-node ML tasks that rely on Spark for data loading and saving.
- Exploratory Data Analysis (EDA): Useful for lightweight analysis with minimal resource requirements.
For more details: Azure Databricks - Single Node clusters.
Please reach out to us if you have any other queries.
If the information is helpful, please Accept Answer and Upvote so that it can help other community members.