Hello Jean Marco Varón Perengueza,
Welcome to the Microsoft Q&A forum.
In the Databricks Spark architecture, changing these values may not be possible on a multi-node cluster. However, when you use a single-node cluster, you can change the driver and worker cores (the driver and workers run on the same node).
Please try this and let us know if you see any issues.
To update the configuration:
Click Edit on your cluster (top-right corner) and go to Advanced Options.
Enter the following in the Spark configuration:
spark.driver.cores 2
spark.executor.cores 6
Once you have updated the values, click Confirm and restart (this will restart the cluster).
I hope this helps.
For more details, see: https://learn.microsoft.com/en-us/azure/databricks/clusters/configure
If this answers your question, please consider accepting the answer by clicking Accept answer and upvoting, as it helps the community find answers to similar questions.