Authentication for Azure Databricks automation
In Azure Databricks, authentication refers to verifying an Azure Databricks identity (such as a user, service principal, or group). Azure Databricks uses credentials (such as an access token or a username and password) to verify the identity.
After Azure Databricks verifies the caller’s identity, Azure Databricks then uses a process called authorization to determine whether the verified identity has sufficient access permissions to perform the specified action on the resource at the given location. This article includes details only about authentication. It does not include details about authorization or access permissions; see Authentication and access control.
When a tool makes an automation or API request, it includes credentials that authenticate an identity with Azure Databricks. This article describes typical ways to create, store, and pass credentials and related information that Azure Databricks needs to authenticate and authorize requests. To learn which credential types, related information, and storage mechanisms are supported by your tools, SDKs, scripts, and apps, see your provider’s documentation.
Databricks account and workspace REST APIs
Databricks organizes its Databricks REST API into two categories of APIs: account APIs and workspace APIs. Each of these categories requires different sets of information to authenticate the target Azure Databricks identity. Also, each supported Databricks authentication type requires additional information that uniquely identifies the target Azure Databricks identity.
For instance, to authenticate an Azure Databricks identity for calling Azure Databricks account-level API operations, you must provide:
- The target Azure Databricks account console URL, which is typically https://accounts.azuredatabricks.net.
- The target Azure Databricks account ID. See Locate your account ID.
- Information that uniquely identifies the target Azure Databricks identity for the target Databricks authentication type. For the specific information to provide, see the section later in this article for that authentication type.
To authenticate an Azure Databricks identity for calling Azure Databricks workspace-level API operations, you must provide:
- The target Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- Information that uniquely identifies the target Azure Databricks identity for the target Databricks authentication type. For the specific information to provide, see the section later in this article for that authentication type.
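For example, a workspace-level REST API request combines the per-workspace URL with a credential such as a personal access token passed as a bearer token. The following minimal Python sketch (using the requests library; the workspace URL and token values are placeholders you supply) lists the clusters in a workspace:
import requests

# Placeholders: supply your own per-workspace URL and personal access token.
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<your-personal-access-token>"

# Call a workspace-level REST API operation, passing the credential as a bearer token.
response = requests.get(
    f"{workspace_url}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()
print(response.json())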
Databricks client unified authentication
Databricks provides a consolidated and consistent architectural and programmatic approach to authentication, known as Databricks client unified authentication. This approach helps make setting up and automating authentication with Databricks more centralized and predictable. It enables you to configure Databricks authentication once and then use that configuration across multiple Databricks tools and SDKs without further authentication configuration changes.
Participating Databricks tools and SDKs include:
- The Databricks CLI
- The Databricks Terraform provider
- Databricks Connect
- The Databricks extension for Visual Studio Code
- The Databricks SDK for Python
- The Databricks SDK for Java
- The Databricks SDK for Go
All participating tools and SDKs accept special environment variables as well as Azure Databricks configuration profiles for authentication. The Databricks Terraform provider and the Databricks SDKs for Python, Java, and Go also accept direct configuration of authentication settings within code. For details, see the following sections and the tool’s or SDK’s documentation.
The following sections contain examples of how to configure your machine for authentication by using special environment variables, Azure Databricks configuration profiles, Databricks Terraform provider code, and code for the Databricks SDKs for Python, Java, and Go. For other participating tools and SDKs:
- The Databricks CLI supports special environment variables as well as Azure Databricks configuration profiles for many authentication types. See the Environment and Profile examples in the following sections. See also Authentication for the Databricks CLI.
- Databricks Connect supports multiple authentication configuration options that are unique to Databricks Connect. See Set up the client in the Databricks Connect documentation.
- The Databricks extension for Visual Studio Code provides a unique user interface for configuring some authentication types and relies on its integration with Databricks Connect for some other authentication types. See Authentication for the Databricks extension for Visual Studio Code.
Azure Databricks personal access token authentication
Azure Databricks personal access tokens are one of the most well-supported types of credentials for resources and operations at the Azure Databricks workspace level. Many storage mechanisms for credentials and related information, such as environment variables and Azure Databricks configuration profiles, provide support for Azure Databricks personal access tokens. Although users can have multiple personal access tokens in an Azure Databricks workspace, each personal access token works for only a single Azure Databricks workspace. The number of personal access tokens per user is limited to 600 per workspace.
Note
To automate Azure Databricks account-level functionality, you cannot use Azure Databricks personal access tokens. Instead, you must use the Azure AD tokens of Azure Databricks account admins. Azure Databricks account admins can be users or service principals. For more information, see Azure AD tokens later in this article.
Azure Databricks personal access tokens for workspace users
To create an Azure Databricks personal access token for your Azure Databricks workspace user, do the following:
In your Azure Databricks workspace, click your Azure Databricks username in the top bar, and then select User Settings from the dropdown.
Click Developer.
Next to Access tokens, click Manage.
Click Generate new token.
(Optional) Enter a comment that helps you to identify this token in the future, and change the token’s default lifetime of 90 days. To create a token with no lifetime (not recommended), leave the Lifetime (days) box empty (blank).
Click Generate.
Copy the displayed token to a secure location, and then click Done.
Be sure to save the copied token in a secure location. Do not share your copied token with others. If you lose the copied token, you cannot regenerate that exact same token. Instead, you must repeat this procedure to create a new token. If you lose the copied token, or you believe that the token has been compromised, Databricks strongly recommends that you immediately delete that token from your workspace by clicking the X next to the token on the Access tokens page.
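You can also create personal access tokens programmatically, for example through the Token API or a Databricks SDK. The following is a minimal sketch that assumes the Databricks SDK for Python and an already authenticated WorkspaceClient; the comment text and lifetime shown are illustrative only:
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Create a token with a descriptive comment and a 90-day lifetime, expressed in seconds.
created = w.tokens.create(
    comment="automation-example",
    lifetime_seconds=90 * 24 * 60 * 60
)

# Store the token value securely; it cannot be retrieved again later.
print(created.token_value)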
Note
If you are not able to create or use tokens in your workspace, this might be because your workspace administrator has disabled tokens or has not given you permission to create or use tokens. See your workspace administrator or the following:
Perform Azure Databricks personal access token authentication
To configure Azure Databricks personal access token authentication, you must set the following associated environment variables, .databrickscfg fields, Terraform fields, or Config fields:
- The Azure Databricks host, specified as the target Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- The Azure Databricks personal access token for the Azure Databricks user account.
To perform Azure Databricks personal access token authentication, integrate the following within your code, based on the participating tool or SDK:
Environment
To use environment variables with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
Set the following environment variables:
- DATABRICKS_HOST, set to the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- DATABRICKS_TOKEN, set to the Azure Databricks personal access token.
Profile
Create or identify an Azure Databricks configuration profile with the following fields in your .databrickscfg file. If you create the profile, replace the placeholders with the appropriate values. To use the profile with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
Set the following values in your .databrickscfg file. In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <workspace-url>
token = <token>
Terraform
provider "databricks" {
alias = "workspace"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
provider "databricks" {
alias = "workspace"
host = <retrieve-workspace-url>
token = <retrieve-token>
}
Python
For default authentication:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
# ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient(
host = retrieve_workspace_url(),
token = retrieve_token()
)
# ...
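If you store the host and token in an Azure Databricks configuration profile instead (see the Profile example earlier in this section), you can reference that profile by name. A minimal sketch, assuming a profile named DEFAULT exists in your .databrickscfg file:
from databricks.sdk import WorkspaceClient

# Use the named configuration profile from the .databrickscfg file.
w = WorkspaceClient(profile = "DEFAULT")
# ...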
Java
For default authentication:
import com.databricks.sdk.WorkspaceClient;
// ...
WorkspaceClient w = new WorkspaceClient();
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;
// ...
DatabricksConfig cfg = new DatabricksConfig()
.setHost(retrieveWorkspaceUrl())
.setToken(retrieveToken());
WorkspaceClient w = new WorkspaceClient(cfg);
// ...
Go
For default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient(&databricks.Config{
Host: retrieveWorkspaceUrl(),
Token: retrieveToken(),
}))
// ...
OAuth user-to-machine (U2M) authentication
OAuth user-to-machine (U2M) authentication uses real-time human sign-in and consent to authenticate the target Azure Databricks user account. After the user successfully signs in and consents to the OAuth authentication request, an OAuth token is given to the participating tool or SDK to perform token-based authentication on the user’s behalf from that time forward. The OAuth token has a lifespan of one hour; after it expires, the tool or SDK automatically attempts to obtain a new token in the background, which is also valid for one hour.
Requirements for OAuth U2M authentication setup
You must be an administrator for the Azure Databricks account that corresponds to your Azure Databricks workspaces. See also Assign account admin roles to a user.
Azure Databricks relies on Azure app registrations to help authenticate Azure Databricks users to Azure Databricks accounts and workspaces for OAuth U2M. You must have an existing Azure app registration, configured as a web app with a redirect URI of http://localhost:8020, within the Azure tenant for your Azure Databricks account. If you do not have one, you must have permission to create an Azure app registration within the Azure tenant for your Azure Databricks account.
To create or identify an Azure app registration to act as a service principal, do the following:
- Use the Azure portal to sign in to the Azure tenant for your Azure Databricks account, at https://portal.azure.com/<tenant-id>.
- Click App registrations. If App registrations is not visible, click More services and use the Filter services text box to search for App registrations.
- If you have an existing app registration, select its name in the list of app registrations. To create an app registration:
  - Click New registration.
  - Enter a Name for the app, and leave Supported account types set to Single tenant.
  - Click Register.
- Add an authentication platform to the app: within Manage, click Authentication.
- Within Platform configurations, click Add a platform.
- Click Web.
- For Redirect URIs, enter http://localhost:8020.
- Click Configure.
- Click Overview.
- Copy the following values:
  - Copy the Application (client) ID value, as you will use it later as the Azure client ID.
  - Copy the Directory (tenant) ID value, as you will use it later as the Azure tenant ID.
Configure Databricks client authentication
To finish configuring OAuth U2M authentication with Azure Databricks, you must set the following associated environment variables, .databrickscfg fields, Terraform fields, or Config fields:
- The Azure Databricks host, specified as https://accounts.azuredatabricks.net for account operations or the target per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net, for workspace operations.
- The Azure Databricks account ID, for Azure Databricks account operations.
- The Azure tenant ID of the Azure app registration that is acting as a service principal.
- The Azure client ID of the Azure app registration that is acting as a service principal.
- The client secret of the Azure app registration that is acting as a service principal.
To perform OAuth U2M authentication with Azure Databricks, integrate the following within your code, based on the participating tool or SDK. Note that, depending on the Azure Databricks operations that your code calls, you do not necessarily need to be an administrator for the Azure Databricks account:
Environment
To use environment variables with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks account console URL, https://accounts.azuredatabricks.net.
- DATABRICKS_ACCOUNT_ID
- ARM_TENANT_ID
- ARM_CLIENT_ID
- ARM_CLIENT_SECRET
For workspace-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- ARM_TENANT_ID
- ARM_CLIENT_ID
- ARM_CLIENT_SECRET
Profile
Create or identify an Azure Databricks configuration profile with the following fields in your .databrickscfg file. If you create the profile, replace the placeholders with the appropriate values. To use the profile with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following values in your .databrickscfg file. In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
For workspace-level operations, set the following values in your .databrickscfg file. In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <workspace-url>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
Terraform
For account-level operations, you should first use the Azure CLI to authenticate the Azure service principal. See Get an Azure AD access token with the Azure CLI. Note that within these instructions, you do not need to run the az account get-access-token command, as the Azure CLI automatically manages these access tokens for you.
For account-level operations, for default authentication:
provider "databricks" {
alias = "account"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
provider "databricks" {
alias = "account"
host = <retrieve-account-console-url>
account_id = <retrieve-account-id>
azure_tenant_id = <retrieve-azure-tenant-id>
azure_client_id = <retrieve-azure-client-id>
azure_client_secret = <retrieve-azure-client-secret>
}
For workspace-level operations, you should first use the Azure CLI to authenticate the Azure service principal. See Get an Azure AD access token with the Azure CLI. Note that within these instructions, you do not need to run the az account get-access-token command, as the Azure CLI automatically manages these access tokens for you.
For workspace-level operations, for default authentication:
provider "databricks" {
alias = "workspace"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
provider "databricks" {
alias = "workspace"
host = <retrieve-workspace-url>
azure_client_id = <retrieve-azure-client-id>
azure_tenant_id = <retrieve-azure-tenant-id>
azure_client_secret = <retrieve-azure-client-secret>
}
Python
Create a Flask application that implements the code within the flask_app_with_oauth.py example in the Databricks SDK for Python repository in GitHub. The Flask code example initiates OAuth U2M authentication and makes authenticated calls to the Databricks REST API.
Java
Use Spring Boot to implement the code within the spring-boot-oauth-u2m-demo example in the Databricks SDK for Java repository in GitHub. The Spring Boot code example initiates OAuth U2M authentication and makes authenticated calls to the Databricks REST API.
Azure MSI authentication
Azure MSI authentication uses an Azure Managed Service Identity (MSI), also known as a system-assigned managed identity, to authenticate the target identity. See What are managed identities for Azure resources?
To configure Azure MSI authentication with Azure Databricks, you must set the following associated environment variables, .databrickscfg fields, Terraform fields, or Config fields:
- The Azure Databricks host.
  - For account operations, specify https://accounts.azuredatabricks.net.
  - For workspace operations, specify the per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- For account operations, the Azure Databricks account ID.
- The Azure resource ID.
- Set Azure use MSI to true.
To perform Azure MSI authentication with Azure Databricks, integrate the following within your code, based on the participating tool or SDK:
Environment
To use environment variables with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks account console URL, https://accounts.azuredatabricks.net.
- DATABRICKS_ACCOUNT_ID
- ARM_USE_MSI, set to true.
For workspace-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- ARM_USE_MSI, set to true.
For workspace-level operations, if the target identity has not already been added to the workspace, then specify DATABRICKS_AZURE_RESOURCE_ID along with the Azure workspace resource ID, instead of DATABRICKS_HOST along with the workspace URL. In this case, the target identity must have at least Contributor or Owner permissions on the Azure workspace resource.
Profile
Create or identify an Azure Databricks configuration profile with the following fields in your .databrickscfg file. If you create the profile, replace the placeholders with the appropriate values. To use the profile with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following values in your .databrickscfg file. In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
azure_use_msi = true
For workspace-level operations, set the following values in your .databrickscfg file. In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <workspace-url>
azure_use_msi = true
For workspace-level operations, if the target identity has not already been added to the workspace, then specify azure_workspace_resource_id along with the Azure workspace resource ID, instead of host along with the workspace URL. In this case, the target identity must have at least Contributor or Owner permissions on the Azure workspace resource.
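For example, such a profile might look like the following sketch, where the placeholder stands for the Azure Resource Manager ID of the workspace:
[<some-unique-configuration-profile-name>]
azure_workspace_resource_id = <azure-workspace-resource-id>
azure_use_msi = true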
Terraform
For account-level operations, for default authentication:
provider "databricks" {
alias = "accounts"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
provider "databricks" {
alias = "accounts"
host = <retrieve-account-console-url>
account_id = <retrieve-account-id>
azure_use_msi = true
}
For workspace-level operations, for default authentication:
provider "databricks" {
alias = "workspace"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
provider "databricks" {
alias = "workspace"
host = <retrieve-workspace-url>
azure_use_msi = true
}
For workspace-level operations, if the target identity has not already been added to the workspace, then specify azure_workspace_resource_id along with the Azure workspace resource ID, instead of host along with the workspace URL. In this case, the target identity must have at least Contributor or Owner permissions on the Azure workspace resource.
Python
Note
The Databricks SDK for Python has not yet implemented Azure MSI authentication.
Java
Note
The Databricks SDK for Java has not yet implemented Azure MSI authentication.
Go
For account-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient(&databricks.Config{
Host: retrieveAccountConsoleUrl(),
AccountId: retrieveAccountId(),
AzureUseMSI: true,
}))
// ...
For workspace-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient(&databricks.Config{
Host: retrieveWorkspaceUrl(),
AzureUseMSI: true,
}))
// ...
For workspace-level operations, if the target identity has not already been added to the workspace, then specify AzureResourceID along with the Azure workspace resource ID, instead of Host along with the workspace URL. In this case, the target identity must have at least Contributor or Owner permissions on the Azure workspace resource.
Azure service principal authentication
Azure service principal authentication uses the credentials of an Azure service principal to authenticate. To create and manage service principals for Azure Databricks, see Provision a service principal for Azure Databricks automation - Azure Databricks UI.
To configure Azure service principal authentication with Azure Databricks, you must set the following associated environment variables, .databrickscfg fields, Terraform fields, or Config fields:
- The Azure Databricks host.
  - For account operations, specify https://accounts.azuredatabricks.net.
  - For workspace operations, specify the per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net. If the Azure service principal has not already been added to the workspace, then specify the Azure resource ID instead. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure resource.
- For account operations, the Azure Databricks account ID.
- The Azure resource ID.
- The tenant ID of the Azure service principal.
- The client ID of the Azure service principal.
- The client secret of the Azure service principal.
To perform Azure service principal authentication with Azure Databricks, integrate the following within your code, based on the participating tool or SDK:
Environment
To use environment variables with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks account console URL, https://accounts.azuredatabricks.net.
- DATABRICKS_ACCOUNT_ID
- ARM_TENANT_ID
- ARM_CLIENT_ID
- ARM_CLIENT_SECRET
For workspace-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- ARM_TENANT_ID
- ARM_CLIENT_ID
- ARM_CLIENT_SECRET
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify DATABRICKS_AZURE_RESOURCE_ID along with the Azure workspace resource ID, instead of DATABRICKS_HOST along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
Profile
Create or identify an Azure Databricks configuration profile with the following fields in your .databrickscfg file. If you create the profile, replace the placeholders with the appropriate values. To use the profile with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following values in your .databrickscfg file. In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
For workspace-level operations, set the following values in your .databrickscfg file. In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <workspace-url>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify azure_workspace_resource_id along with the Azure workspace resource ID, instead of host along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
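For example, such a profile might look like the following sketch (the placeholders follow the earlier examples, with the Azure workspace resource ID in place of the host):
[<some-unique-configuration-profile-name>]
azure_workspace_resource_id = <azure-workspace-resource-id>
azure_tenant_id = <azure-service-principal-tenant-id>
azure_client_id = <azure-service-principal-application-id>
azure_client_secret = <azure-service-principal-client-secret>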
Terraform
For account-level operations, for default authentication:
provider "databricks" {
alias = "accounts"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
provider "databricks" {
alias = "accounts"
host = <retrieve-account-console-url>
account_id = <retrieve-account-id>
azure_tenant_id = <retrieve-azure-tenant-id>
azure_client_id = <retrieve-azure-client-id>
azure_client_secret = <retrieve-azure-client-secret>
}
For workspace-level operations, for default authentication:
provider "databricks" {
alias = "workspace"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
provider "databricks" {
alias = "workspace"
host = <retrieve-workspace-url>
azure_tenant_id = <retrieve-azure-tenant-id>
azure_client_id = <retrieve-azure-client-id>
azure_client_secret = <retrieve-azure-client-secret>
}
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify azure_workspace_resource_id along with the Azure workspace resource ID, instead of host along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
Python
For account-level operations, for default authentication:
from databricks.sdk import AccountClient
a = AccountClient()
# ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
from databricks.sdk import AccountClient
a = AccountClient(
host = retrieve_account_console_url(),
account_id = retrieve_account_id(),
azure_tenant_id = retrieve_azure_tenant_id(),
azure_client_id = retrieve_azure_client_id(),
azure_client_secret = retrieve_azure_client_secret()
)
# ...
For workspace-level operations, for default authentication:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
# ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient(
host = retrieve_workspace_url(),
azure_tenant_id = retrieve_azure_tenant_id(),
azure_client_id = retrieve_azure_client_id(),
azure_client_secret = retrieve_azure_client_secret()
)
# ...
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify azure_workspace_resource_id along with the Azure workspace resource ID, instead of host along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
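A minimal sketch of that variant with the Databricks SDK for Python, with the retrieve placeholders standing in for your own lookup logic:
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
  azure_workspace_resource_id = retrieve_azure_workspace_resource_id(),
  azure_tenant_id = retrieve_azure_tenant_id(),
  azure_client_id = retrieve_azure_client_id(),
  azure_client_secret = retrieve_azure_client_secret()
)
# ...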
Java
For account-level operations, for default authentication:
import com.databricks.sdk.AccountClient;
// ...
AccountClient a = new AccountClient();
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
import com.databricks.sdk.AccountClient;
import com.databricks.sdk.core.DatabricksConfig;
// ...
DatabricksConfig cfg = new DatabricksConfig()
.setHost(retrieveAccountConsoleUrl())
.setAccountId(retrieveAccountId())
.setAzureTenantId(retrieveAzureTenantId())
.setAzureClientId(retrieveAzureClientId())
.setAzureClientSecret(retrieveAzureClientSecret());
AccountClient a = new AccountClient(cfg);
// ...
For workspace-level operations, for default authentication:
import com.databricks.sdk.WorkspaceClient;
// ...
WorkspaceClient w = new WorkspaceClient();
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;
// ...
DatabricksConfig cfg = new DatabricksConfig()
.setHost(retrieveWorkspaceUrl())
.setAzureTenantId(retrieveAzureTenantId())
.setAzureClientId(retrieveAzureClientId())
.setAzureClientSecret(retrieveAzureClientSecret());
WorkspaceClient w = new WorkspaceClient(cfg);
// ...
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify setAzureWorkspaceResourceId along with the Azure workspace resource ID, instead of setHost along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
Go
For account-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient(&databricks.Config{
Host: retrieveAccountConsoleUrl(),
AccountId: retrieveAccountId(),
AzureTenantId: retrieveAzureTenantId(),
AzureClientId: retrieveAzureClientId(),
AzureClientSecret: retrieveAzureClientSecret(),
}))
// ...
For workspace-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient(&databricks.Config{
Host: retrieveWorkspaceUrl(),
AzureTenantId: retrieveAzureTenantId(),
AzureClientId: retrieveAzureClientId(),
AzureClientSecret: retrieveAzureClientSecret(),
}))
// ...
For workspace-level operations, if the Azure service principal has not already been added to the workspace, then specify AzureWorkspaceResourceId along with the Azure workspace resource ID, instead of Host along with the workspace URL. In this case, the Azure service principal must have at least Contributor or Owner permissions on the Azure workspace resource.
Azure CLI authentication
Azure CLI authentication uses the Azure CLI to authenticate the signed-in entity.
To configure Azure CLI authentication with Azure Databricks, you must have the Azure CLI installed locally. You must also set the following associated environment variables, .databrickscfg fields, Terraform fields, or Config fields:
- The Azure Databricks host.
  - For account operations, specify https://accounts.azuredatabricks.net.
  - For workspace operations, specify the per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
- For account operations, the Azure Databricks account ID.
To perform Azure CLI authentication with Azure Databricks, integrate the following within your code, based on the participating tool or SDK:
Environment
To use environment variables with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks account console URL, https://accounts.azuredatabricks.net.
- DATABRICKS_ACCOUNT_ID
For workspace-level operations, set the following environment variables:
- DATABRICKS_HOST, set to the value of your Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net.
Profile
Create or identify an Azure Databricks configuration profile with the following fields in your .databrickscfg file. If you create the profile, replace the placeholders with the appropriate values. To use the profile with a tool or SDK, see the tool’s or SDK’s documentation. See also Environment variables and fields for client unified authentication and the Default order of evaluation for client unified authentication methods and credentials.
For account-level operations, set the following values in your .databrickscfg file. In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
For workspace-level operations, set the following values in your .databrickscfg file. In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
[<some-unique-configuration-profile-name>]
host = <workspace-url>
Terraform
For account-level operations, for default authentication:
provider "databricks" {
alias = "accounts"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
provider "databricks" {
alias = "accounts"
host = <retrieve-account-console-url>
account_id = <retrieve-account-id>
}
For workspace-level operations, for default authentication:
provider "databricks" {
alias = "workspace"
}
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as HashiCorp Vault. See also Vault Provider). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
provider "databricks" {
alias = "workspace"
host = <retrieve-workspace-url>
}
Python
For account-level operations, for default authentication:
from databricks.sdk import AccountClient
a = AccountClient()
# ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
from databricks.sdk import AccountClient
a = AccountClient(
host = retrieve_account_console_url(),
account_id = retrieve_account_id()
)
# ...
For workspace-level operations, for default authentication:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
# ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
from databricks.sdk import WorkspaceClient
w = WorkspaceClient(host = retrieve_workspace_url())
# ...
Java
For account-level operations, for default authentication:
import com.databricks.sdk.AccountClient;
// ...
AccountClient a = new AccountClient();
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
import com.databricks.sdk.AccountClient;
import com.databricks.sdk.core.DatabricksConfig;
// ...
DatabricksConfig cfg = new DatabricksConfig()
.setHost(retrieveAccountConsoleUrl())
.setAccountId(retrieveAccountId());
AccountClient a = new AccountClient(cfg);
// ...
For workspace-level operations, for default authentication:
import com.databricks.sdk.WorkspaceClient;
// ...
WorkspaceClient w = new WorkspaceClient();
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;
// ...
DatabricksConfig cfg = new DatabricksConfig()
.setHost(retrieveWorkspaceUrl());
WorkspaceClient w = new WorkspaceClient(cfg);
// ...
Go
For account-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the Azure Databricks account console URL is https://accounts.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
a := databricks.Must(databricks.NewAccountClient(&databricks.Config{
Host: retrieveAccountConsoleUrl(),
AccountId: retrieveAccountId(),
}))
// ...
For workspace-level operations, for default authentication:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient())
// ...
For direct configuration (replace the retrieve placeholders with your own implementation to retrieve the values from the console or some other configuration store, such as Azure KeyVault). In this case, the host is the Azure Databricks per-workspace URL, for example https://adb-1234567890123456.7.azuredatabricks.net:
import (
"github.com/databricks/databricks-sdk-go"
)
// ...
w := databricks.Must(databricks.NewWorkspaceClient(&databricks.Config{
Host: retrieveWorkspaceUrl(),
}))
// ...
Default order of evaluation for client unified authentication methods and credentials
Whenever a participating tool or SDK needs to authenticate with Azure Databricks, the tool or SDK tries the following types of authentication in the following order by default. When the tool or SDK succeeds with the type of authentication that it tries, the tool or SDK stops trying to authenticate with the remaining authentication types. To force an SDK to authenticate with a specific authentication type, set the Config API’s Databricks authentication type field.
- Azure Databricks personal access token authentication
- OAuth user-to-machine (U2M) authentication
- Azure MSI authentication
- Azure service principal authentication
- Azure CLI authentication
For each authentication type that the participating tool or SDK tries, the tool or SDK tries to find authentication credentials in the following locations, in the following order. When the tool or SDK succeeds in finding authentication credentials that can be used, the tool or SDK stops trying to find authentication credentials in the remaining locations.
- Credential-related Config API fields (for SDKs). To set Config fields, see the SDK’s reference documentation.
- Credential-related environment variables. To set environment variables, see your operating system’s documentation.
- Credential-related fields in the DEFAULT configuration profile within the .databrickscfg file. To set configuration profile fields, see Azure Databricks configuration profiles.
- Any related authentication credentials that are cached by the Azure CLI. See Azure CLI.
To provide maximum portability for your code, Databricks recommends that you create a custom configuration profile within the .databrickscfg file, add the required fields for your target Databricks authentication type to the custom configuration profile, and then set the DATABRICKS_CONFIG_PROFILE environment variable to the name of the custom configuration profile.
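For example, assuming a custom profile named DEVELOPMENT already exists in your .databrickscfg file, a minimal Python sketch of this pattern is:
import os

from databricks.sdk import WorkspaceClient

# Point participating tools and SDKs at the custom configuration profile.
os.environ["DATABRICKS_CONFIG_PROFILE"] = "DEVELOPMENT"

# The SDK resolves the host and credentials from that profile.
w = WorkspaceClient()
# ...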
Environment variables and fields for client unified authentication
The following tables list the names and descriptions of the supported environment variables and fields for Databricks client unified authentication. In the following tables:
- Environment variable, where applicable, is the name of the environment variable. To set environment variables, see your operating system’s documentation.
- .databrickscfg field, where applicable, is the name of the field within an Azure Databricks configuration profiles file or Databricks Terraform configuration. To set .databrickscfg fields, see Azure Databricks configuration profiles.
- Terraform field, where applicable, is the name of the field within a Databricks Terraform configuration. To set Databricks Terraform fields, see Authentication in the Databricks Terraform provider documentation.
- Config field is the name of the field within the Config API for the specified SDK. To use the Config API, see the SDK’s reference documentation.
General host, token, and account ID environment variables and fields
Common name | Description | Environment variable | .databrickscfg field, Terraform field | Config field
---|---|---|---|---
Azure Databricks host | (String) The Azure Databricks host URL for either the Azure Databricks workspace endpoint or the Azure Databricks accounts endpoint. | DATABRICKS_HOST | host | host (Python), setHost (Java), Host (Go)
Azure Databricks token | (String) The Azure Databricks personal access token or Azure Active Directory (Azure AD) token. | DATABRICKS_TOKEN | token | token (Python), setToken (Java), Token (Go)
Azure Databricks account ID | (String) The Azure Databricks account ID for the Azure Databricks account endpoint. Only has effect when the Azure Databricks host is also set to https://accounts.azuredatabricks.net. | DATABRICKS_ACCOUNT_ID | account_id | account_id (Python), setAccountID (Java), AccountID (Go)
Azure-specific environment variables and fields
Common name | Description | Environment variable | .databrickscfg field, Terraform field | Config field
---|---|---|---|---
Azure client ID | (String) The Azure AD service principal’s application ID. | ARM_CLIENT_ID | azure_client_id | azure_client_id (Python), setAzureClientID (Java), AzureClientID (Go)
Azure client secret | (String) The Azure AD service principal’s client secret. | ARM_CLIENT_SECRET | azure_client_secret | azure_client_secret (Python), setAzureClientSecret (Java), AzureClientSecret (Go)
Azure environment | (String) The Azure environment type. Defaults to PUBLIC. | ARM_ENVIRONMENT | azure_environment | azure_environment (Python), setAzureEnvironment (Java), AzureEnvironment (Go)
Azure tenant ID | (String) The Azure AD service principal’s tenant ID. | ARM_TENANT_ID | azure_tenant_id | azure_tenant_id (Python), setAzureTenantID (Java), AzureTenantID (Go)
Azure use MSI | (Boolean) True to use Azure Managed Service Identity passwordless authentication flow for service principals. Requires the Azure resource ID to also be set. | ARM_USE_MSI | azure_use_msi | AzureUseMSI (Go)
Azure resource ID | (String) The Azure Resource Manager ID for the Azure Databricks workspace. | DATABRICKS_AZURE_RESOURCE_ID | azure_workspace_resource_id | azure_workspace_resource_id (Python), setAzureResourceID (Java), AzureResourceID (Go)
.databrickscfg-specific environment variables and fields
Use these environment variables or fields to specify non-default settings for .databrickscfg. See also Azure Databricks configuration profiles.
Common name | Description | Environment variable | Terraform field | Config field
---|---|---|---|---
.databrickscfg file path | (String) A non-default path to the .databrickscfg file. | DATABRICKS_CONFIG_FILE | config_file | config_file (Python), setConfigFile (Java), ConfigFile (Go)
.databrickscfg default profile | (String) The default named profile to use, other than DEFAULT. | DATABRICKS_CONFIG_PROFILE | profile | profile (Python), setProfile (Java), Profile (Go)
Authentication type field
Use this environment variable or field to force an SDK to use a specific type of Databricks authentication.
Common name | Description | Terraform field | Config field
---|---|---|---
Databricks authentication type | (String) When multiple authentication attributes are available in the environment, use the authentication type specified by this argument. | auth_type | auth_type (Python), setAuthType (Java), AuthType (Go)
Supported Databricks authentication type field values include:
- databricks-cli: OAuth user-to-machine (U2M) authentication
- azure-msi: Azure MSI authentication
- azure-client-secret: Azure service principal authentication
- azure-cli: Azure CLI authentication
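For example, to force the Databricks SDK for Python to use one of these authentication types regardless of which other credentials are present in the environment, you can set the field directly when constructing the client (a minimal sketch; azure-cli is just one of the values listed above):
from databricks.sdk import WorkspaceClient

# Skip the default order of evaluation and use Azure CLI authentication only.
w = WorkspaceClient(auth_type = "azure-cli")
# ...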
Azure Databricks configuration profiles
An Azure Databricks configuration profile (sometimes referred to as a configuration profile, a config profile, or simply a profile) contains settings and other information that Azure Databricks needs to authenticate. Azure Databricks configuration profiles are stored in Azure Databricks configuration profiles files for your tools, SDKs, scripts, and apps to use. To learn whether Azure Databricks configuration profiles are supported by your tools, SDKs, scripts, and apps, see your provider’s documentation. All participating tools and SDKs that implement Databricks client unified authentication support Azure Databricks configuration profiles.
To create an Azure Databricks configuration profiles file:
Use your favorite text editor to create a file named .databrickscfg in your ~ (your user home) folder on Unix, Linux, or macOS, or your %USERPROFILE% (your user home) folder on Windows, if you do not already have one. Do not forget the dot (.) at the beginning of the file name. Add the following contents to this file:
[<some-unique-name-for-this-configuration-profile>]
<field-name> = <field-value>
In the preceding contents, replace the following values, and then save the file:
- <some-unique-name-for-this-configuration-profile> with a unique name for the configuration profile, such as DEFAULT, DEVELOPMENT, PRODUCTION, or similar. You can have multiple configuration profiles in the same .databrickscfg file, but each configuration profile must have a unique name within this file.
- <field-name> and <field-value> with the name and a value for one of the required fields for the target Databricks authentication type. For the specific information to provide, see the section earlier in this article for that authentication type.
- Add a <field-name> and <field-value> pair for each of the additional required fields for the target Databricks authentication type.
For example, for Azure Databricks personal access token authentication, the .databrickscfg file might look like this:
[DEFAULT]
host = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi123...
To create additional configuration profiles, specify different profile names within the same .databrickscfg file. For example, to specify separate Azure Databricks workspaces, each with their own Azure Databricks personal access token:
[DEFAULT]
host = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi123...
[DEVELOPMENT]
host = https://adb-2345678901234567.8.azuredatabricks.net
token = dapi234...
You can also specify different profile names within the .databrickscfg file for Azure Databricks accounts and different Databricks authentication types, for example:
[DEFAULT]
host = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi123...
[DEVELOPMENT]
azure_workspace_resource_id = /subscriptions/bc0cd1.../resourceGroups/my-resource-group/providers/Microsoft.Databricks/workspaces/my-workspace
azure_tenant_id = bc0cd1...
azure_client_id = fa0cd1...
azure_client_secret = aBC1D~...
ODBC DSNs
In ODBC, a data source name (DSN) is a symbolic name that tools, SDKs, scripts, and apps use to request a connection to an ODBC data source. A DSN stores connection details such as the path to an ODBC driver, networking details, authentication credentials, and database details. To learn whether ODBC DSNs are supported by your tools, scripts, and apps, see your provider’s documentation.
To install and configure the Databricks ODBC Driver and create an ODBC DSN for Azure Databricks, see ODBC driver.
JDBC connection URLs
In JDBC, a connection URL is a symbolic URL that tools, SDKs, scripts, and apps use to request a connection to a JDBC data source. A connection URL stores connection details such as networking details, authentication credentials, database details, and JDBC driver capabilities. To learn whether JDBC connection URLs are supported by your tools, SDKs, scripts, and apps, see your provider’s documentation.
To install and configure the Databricks JDBC Driver and create a JDBC connection URL for Azure Databricks, see JDBC driver.
Azure AD tokens
Azure Active Directory (Azure AD) tokens are one of the most well-supported types of credentials for Azure Databricks, both at the Azure Databricks workspace and account levels.
Note
Some tools, SDKs, scripts, and apps only support Azure Databricks personal access token authentication and not Azure AD tokens. To learn whether Azure AD tokens are supported by your tools, SDKs, scripts, and apps, see your provider’s documentation.
Azure AD token authentication for users
Databricks does not recommend that you create Azure AD tokens for Azure Databricks users manually. This is because each Azure AD token is short-lived, typically expiring within one hour. After this time, you must manually generate a replacement Azure AD token. Instead, use one of the participating tools or SDKs that implement the Databricks client unified authentication standard. These tools and SDKs automatically generate and replace expired Azure AD tokens for you, leveraging the following Databricks authentication types:
If you must manually create an Azure AD token for an Azure Databricks user, see:
Azure AD token authentication for service principals
Databricks does not recommend that you create Azure AD tokens for Azure AD service principals manually. This is because each Azure AD token is short-lived, typically expiring within one hour. After this time, you must manually generate a replacement Azure AD token. Instead, use one of the participating tools or SDKs that implement the Databricks client unified authentication standard. These tools and SDKs automatically generate and replace expired Azure AD tokens for you, leveraging the following Databricks authentication types:
If you must manually create an Azure AD token for an Azure AD service principal, see:
- Get an Azure AD access token with the Microsoft identity platform REST API
- Get an Azure AD access token with the Azure CLI
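For example, one way to do this from code is with the Azure Identity client library for Python, requesting a token scoped to the Azure Databricks resource. This is a hedged sketch: it assumes the azure-identity package is installed, that 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the programmatic ID for the Azure Databricks resource (verify it against the linked articles), and that you supply the placeholder tenant, client ID, and client secret values:
from azure.identity import ClientSecretCredential

# Placeholders: supply your Azure AD service principal's details.
credential = ClientSecretCredential(
    tenant_id = "<azure-tenant-id>",
    client_id = "<azure-client-id>",
    client_secret = "<azure-client-secret>"
)

# Request an Azure AD access token for the Azure Databricks resource.
access_token = credential.get_token("2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default")
print(access_token.token)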
Azure CLI
The Azure CLI enables you to authenticate with Azure Databricks through PowerShell, through your terminal for Linux or macOS, or through your Command Prompt for Windows. To learn whether the Azure CLI is supported by your tools, SDKs, scripts, and apps, see your provider’s documentation.
To use the Azure CLI to authenticate with Azure Databricks manually, run the az login command:
az login
To authenticate by using an Azure service principal, see Azure CLI login with an Azure service principal.
To authenticate by using an Azure Databricks user account, see Azure CLI login with an Azure Databricks user account.
Note that tools and SDKs that implement the Databricks client unified authentication standard and that rely on the Azure CLI should run the Azure CLI automatically on your behalf to create and manage Azure Databricks authentication.