Hi Rishi Rithi,

This issue usually happens when the Standard_DS3_v2 VM size isn't available in the region you're trying to use. Azure VM availability varies by region due to capacity and quota limits.
Here’s how you can check VM availability:
Run this command in the Azure CLI to see where Standard_DS3_v2 is available:
az vm list-skus --size Standard_DS3_v2 --all --output table
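If you'd rather script the check, the JSON output of the same command can be filtered for regions where the SKU carries no subscription restrictions. A minimal sketch, assuming output shaped like `az vm list-skus --output json` (the sample data below is illustrative, not real CLI output):

```python
import json

# Illustrative sample shaped like `az vm list-skus --size Standard_DS3_v2 --all --output json`.
# Real output depends on your subscription; entries with a non-empty "restrictions"
# list are not usable in the listed locations.
sample = json.loads("""
[
  {"name": "Standard_DS3_v2", "locations": ["eastus"], "restrictions": []},
  {"name": "Standard_DS3_v2", "locations": ["westeurope"],
   "restrictions": [{"reasonCode": "NotAvailableForSubscription"}]}
]
""")

def available_regions(skus, size):
    """Return locations where `size` is listed without restrictions."""
    regions = set()
    for sku in skus:
        if sku["name"] == size and not sku.get("restrictions"):
            regions.update(sku["locations"])
    return sorted(regions)

print(available_regions(sample, "Standard_DS3_v2"))  # ['eastus']
```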
If the VM size is available in some regions but not in yours, you may need to:
- Deploy your Databricks workspace in a supported region: Supported Azure regions for Databricks
- Or use a different VM size, such as Standard_DS2_v2, that fits within your 10-core quota.
Also, if your region supports the VM but you’ve hit your quota, you can request an increase: https://learn.microsoft.com/en-us/azure/quotas/per-vm-quota-requests
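To see why the 10-core quota matters: Standard_DS3_v2 uses 4 vCPUs per VM and Standard_DS2_v2 uses 2, and a minimal Databricks cluster needs a driver plus at least one worker. A quick sketch of the arithmetic, using quota figures shaped like `az vm list-usage --location <region>` output (the numbers below are hypothetical, not from your subscription):

```python
# vCPU counts for the two VM sizes discussed above (from Azure's published specs).
VCPUS = {"Standard_DS3_v2": 4, "Standard_DS2_v2": 2}

def nodes_that_fit(size, current_value, limit):
    """How many more VMs of `size` fit in the remaining regional vCPU quota.

    `current_value` and `limit` mirror the fields returned by `az vm list-usage`.
    """
    remaining = limit - current_value
    return max(remaining // VCPUS[size], 0)

# With a 10-core quota and 0 cores currently in use:
print(nodes_that_fit("Standard_DS3_v2", 0, 10))  # 2 nodes (driver + 1 worker)
print(nodes_that_fit("Standard_DS2_v2", 0, 10))  # 5 nodes
```

So under a 10-core quota, Standard_DS3_v2 leaves room for only a 2-node cluster, while Standard_DS2_v2 allows up to 5 nodes.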
Hope this helps. If this answers your query, please click "Accept Answer" and "Yes" for "Was this answer helpful?". And if you have any further questions, do let us know.