Can't create Databricks cluster

TRẦN ĐỨC PHÚ 1 Reputation point
2022-10-14T08:52:45.857+00:00

Hello everyone,
I am using a Student subscription and I cannot create a Databricks cluster. My friend uses a similar subscription, but she can create one. I get several error messages, such as:

  • Timeout
  • Quota full; error while requesting a quota increase
  • Cannot have more than x number of Public IP addresses

When I read the documentation it is confusing and I don't know how to solve this problem. I tried a few pages, but when I followed them it didn't work. I hope the Admin and everyone can guide me on how to fix this and how to create a Databricks cluster. Thank you everyone for reading my post; I look forward to hearing from you soon.

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. PRADEEPCHEEKATLA 90,641 Reputation points Moderator
    2022-10-17T06:24:51.143+00:00

    Hello @TRẦN ĐỨC PHÚ ,

    Thanks for the question and using MS Q&A platform.

    This is a known issue when you create a multi-node cluster.

    A multi-node Azure Databricks cluster is not available under an Azure free trial/Student/Pass subscription.

    Reason: an Azure free trial/Student/Pass subscription has a limit of 4 cores, and a multi-node Databricks cluster requires at least 8 cores (a 4-core driver plus at least one 4-core worker), which exceeds that quota.

    You need to upgrade to a Pay-As-You-Go subscription to create multi-node Azure Databricks clusters.

    Note: Azure Student subscriptions aren't eligible for limit or quota increases. If you have a Student subscription, you can upgrade to a Pay-As-You-Go subscription.

    You can use an Azure Student subscription to create a Single Node cluster, which has one driver node with 4 cores.

    A Single Node cluster is a cluster consisting of a Spark driver and no Spark workers. Such clusters support Spark jobs and all Spark data sources, including Delta Lake. In contrast, Standard clusters require at least one Spark worker to run Spark jobs.

    Single Node clusters are helpful in the following situations:

    • Running single node machine learning workloads that need Spark to load and save data
    • Lightweight exploratory data analysis (EDA)
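
    As a rough sketch of what is described above, a Single Node cluster can also be created through the Databricks Clusters REST API (`POST /api/2.0/clusters/create`) by setting `num_workers` to 0 together with the single-node Spark configuration and resource tag. The workspace URL, token, `spark_version`, and `node_type_id` below are placeholder assumptions; substitute values valid for your own workspace:

    ```python
    import json
    import urllib.request

    def single_node_cluster_spec(name: str, node_type: str, spark_version: str) -> dict:
        """Build a cluster spec for a Single Node (driver-only) cluster:
        num_workers=0 plus the single-node Spark conf and resource tag."""
        return {
            "cluster_name": name,
            "spark_version": spark_version,
            "node_type_id": node_type,
            "num_workers": 0,  # no Spark workers: driver-only cluster
            "spark_conf": {
                "spark.master": "local[*]",
                "spark.databricks.cluster.profile": "singleNode",
            },
            "custom_tags": {"ResourceClass": "SingleNode"},
        }

    # Placeholder values -- replace with a node type and runtime
    # version that are actually available in your region.
    spec = single_node_cluster_spec(
        name="student-single-node",
        node_type="Standard_DS3_v2",   # a 4-core VM size, within the Student quota
        spark_version="11.3.x-scala2.12",
    )

    def create_cluster(workspace_url: str, token: str, spec: dict) -> None:
        """Send the spec to the Clusters API; prints the response,
        which contains the new cluster_id on success."""
        req = urllib.request.Request(
            f"{workspace_url}/api/2.0/clusters/create",
            data=json.dumps(spec).encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp))
    ```

    Setting `num_workers` to 0 alone is not enough: the `spark.databricks.cluster.profile` conf and the `ResourceClass` tag tell Databricks to run the driver as a local Spark master so jobs execute without workers.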


    Reference: Azure Databricks - Single Node clusters

    Hope this helps. Do let us know if you have any further queries.

