Question about Databricks Cluster Spark Config

Marco117
2023-07-27T15:29:35.9733333+00:00

Good morning. According to the resources assigned to the cluster I work on, we have 4 driver cores and 4 worker cores. My question is: is it possible to change the distribution of these cores in the Spark config panel? I would like to have at least 6 worker cores and 2 driver cores.

[Screenshot of the cluster configuration, 2023-07-27]

This would be my spark config:

spark.driver.cores 2 
spark.executor.cores 6 

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

Accepted answer
  1. Bhargava-MSFT, Microsoft Employee Moderator
    2023-07-27T20:51:47.8433333+00:00

    Hello Jean Marco Varón Perengueza,

    Welcome to the Microsoft Q&A forum.

    In the Databricks Spark architecture, this may not be possible on a multi-node cluster. However, on a single-node cluster you can change the driver and worker cores, since the driver and workers run on the same node.

    Please try this and let us know if you see any issues.

    To update this:

    Click Edit on your cluster (top-right corner) and go to Advanced Options.

    Enter the following in the Spark config field:

    spark.driver.cores 2

    spark.executor.cores 6

    Once you have updated the values, click Confirm and restart (this restarts the cluster).
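    The Spark config box above takes plain whitespace-separated `key value` lines, one per line. As a small illustration (the hypothetical `parse_spark_conf` helper below is not part of Databricks), you can sanity-check such lines before pasting them in:

    ```python
    def parse_spark_conf(text: str) -> dict:
        """Parse whitespace-separated 'key value' Spark config lines into a dict.

        Illustrative helper only; Databricks itself parses the Spark config
        box the same way, one 'key value' pair per line.
        """
        conf = {}
        for line in text.strip().splitlines():
            key, _, value = line.strip().partition(" ")
            conf[key] = value.strip()
        return conf

    settings = parse_spark_conf("""
    spark.driver.cores 2
    spark.executor.cores 6
    """)
    print(settings)  # {'spark.driver.cores': '2', 'spark.executor.cores': '6'}
    ```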


    I hope this helps.

    https://learn.microsoft.com/en-us/azure/databricks/clusters/configure

    If this answers your question, please consider accepting the answer by hitting the Accept answer and up-vote as it helps the community look for answers to similar questions.

