If you don’t have access to app registration, there are still a few ways to connect Azure Databricks to an Azure Storage account. You won’t be able to use service principals directly (which requires app registration), but you can leverage other options that don’t require admin-level privileges. Here are a few alternative methods:
If Azure AD (AAD) credential passthrough is enabled in your Databricks environment, you can authenticate with your own credentials, no app registration needed: Azure Databricks passes your Azure AD identity through to the storage account.
Check with your administrator to confirm that credential passthrough is enabled on your cluster.
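One quick sanity check you can run yourself, assuming your cluster exposes the documented passthrough flag in its Spark config:
# Returns "true" on clusters with Azure AD credential passthrough enabled
print(spark.conf.get("spark.databricks.passthrough.enabled", "false"))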
You can then use code like the following in a Databricks notebook to mount the storage account to DBFS. Note that passthrough mounts go through the ADLS Gen2 (abfss) endpoint rather than the wasbs blob endpoint:
# Configuration for the storage account
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
# Passthrough uses your own Azure AD identity, so no client ID, secret, or app registration is needed
configs = {
    "fs.azure.account.auth.type": "CustomAccessToken",
    "fs.azure.account.custom.token.provider.class":
        spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName"),
}
# Mount the storage account (ADLS Gen2 endpoint) to DBFS
dbutils.fs.mount(
    source=f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/",
    mount_point="/mnt/your-mount-point",
    extra_configs=configs,
)
After mounting, you can read and write files in the storage account as if they were part of the Databricks File System (DBFS).
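For example, once the mount exists you can browse and read it like any other DBFS path (the mount point and file name below are placeholders):
# List the files under the mount point
display(dbutils.fs.ls("/mnt/your-mount-point"))
# Read a CSV from the mounted container like a regular DBFS path
df = spark.read.csv("/mnt/your-mount-point/your_file.csv", header=True)
df.show()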
Alternatively, you can use a Shared Access Signature (SAS) token for the storage account if you have permission to obtain one. This doesn’t require app registration and is a simpler way to authenticate without Azure AD or a service principal.
Either request the token from your administrator, or generate it yourself in the Azure portal if you have the necessary permissions.
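If you would rather script the token than click through the portal, here is a minimal sketch using the azure-storage-blob SDK (pip install azure-storage-blob). Note that generating a container SAS this way requires the account key, and all names below are placeholders:
from datetime import datetime, timedelta
from azure.storage.blob import ContainerSasPermissions, generate_container_sas
# Generate a read/list SAS token for the container, valid for 8 hours
sas_token = generate_container_sas(
    account_name="your_storage_account_name",
    container_name="your_container_name",
    account_key="your_account_key",
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.utcnow() + timedelta(hours=8),
)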
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
sas_token = "your_sas_token"
# Mount the container, authenticating with the SAS token
dbutils.fs.mount(
    source=f"wasbs://{container_name}@{storage_account_name}.blob.core.windows.net/",
    mount_point="/mnt/your-mount-point",
    extra_configs={f"fs.azure.sas.{container_name}.{storage_account_name}.blob.core.windows.net": sas_token},
)
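One gotcha: dbutils.fs.mount raises an error if the mount point is already in use, so a small guard like this (placeholder mount point) keeps the notebook re-runnable:
mount_point = "/mnt/your-mount-point"
# Unmount first if this mount point already exists, so re-running the notebook works
if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.unmount(mount_point)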
If you don’t want to mount the storage account, you can also read and write data directly, either through Spark with a session-scoped credential or with the Azure SDKs (such as the Azure Blob Storage SDK). For example, with Spark and a SAS token:
# Example using the storage account and SAS token
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
sas_token = "your_sas_token"
# Register the SAS token for this container in the Spark session
# (the WASB driver reads it from the config; appending it to the URL does not work)
spark.conf.set(
    f"fs.azure.sas.{container_name}.{storage_account_name}.blob.core.windows.net",
    sas_token,
)
# Read the file into a DataFrame
url = f"wasbs://{container_name}@{storage_account_name}.blob.core.windows.net/your_file.csv"
df = spark.read.csv(url)
# Show the data
df.show()
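If you want to skip Spark entirely, the same SAS token works with the Azure Blob Storage SDK; a minimal sketch, reusing the placeholder names above:
from azure.storage.blob import ContainerClient
# Authenticate to the container with the SAS token
container = ContainerClient(
    account_url=f"https://{storage_account_name}.blob.core.windows.net",
    container_name=container_name,
    credential=sas_token,
)
# List blobs, then download one of them
for blob in container.list_blobs():
    print(blob.name)
data = container.download_blob("your_file.csv").readall()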
If you have access to the storage account keys (not recommended for production, but fine for testing), you can use them to connect Databricks to the storage account.
Request this from your administrator or retrieve it from the Azure portal if you have access.
Mount Using Storage Key:
storage_account_name = "your_storage_account_name"
container_name = "your_container_name"
storage_account_key = "your_storage_account_key"
# Mount the container, authenticating with the account key
dbutils.fs.mount(
    source=f"wasbs://{container_name}@{storage_account_name}.blob.core.windows.net/",
    mount_point="/mnt/your-mount-point",
    extra_configs={f"fs.azure.account.key.{storage_account_name}.blob.core.windows.net": storage_account_key},
)
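If you prefer not to create a mount, the key can instead be set on the Spark session, which scopes the credential to the current session rather than a workspace-wide mount:
# Session-scoped alternative: register the key, then read wasbs:// paths directly
spark.conf.set(
    f"fs.azure.account.key.{storage_account_name}.blob.core.windows.net",
    storage_account_key,
)
df = spark.read.csv(
    f"wasbs://{container_name}@{storage_account_name}.blob.core.windows.net/your_file.csv"
)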
In some cases you may still need your administrator’s assistance, but these alternatives often provide adequate access without app registration.