I think you may have a problem with how the storage account and credentials are configured.
As a first step, check whether the storage account you're using actually has the hierarchical namespace (HNS) enabled:
- Go to the Azure Portal.
- Navigate to the storage account.
- Under Configuration, verify that Hierarchical namespace is set to Enabled.
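You can also check this outside the portal. Here is a minimal sketch using the Azure SDK for Python, assuming the azure-identity and azure-mgmt-storage packages are installed and that <subscription-id>, <resource-group>, and <storage-account-name> are placeholders for your own values:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Authenticate with whatever credential is available (Azure CLI login, managed identity, etc.)
credential = DefaultAzureCredential()
storage_client = StorageManagementClient(credential, "<subscription-id>")

# is_hns_enabled reports whether the hierarchical namespace is turned on for the account
account = storage_client.storage_accounts.get_properties("<resource-group>", "<storage-account-name>")
print(account.is_hns_enabled)

If this prints False, the account is not behaving as ADLS Gen2 and abfss:// access will not work as expected.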
You already mentioned that the Storage Blob Data Reader and Storage Blob Data Contributor roles are assigned to the principal you're using (either user, service principal, or managed identity).
Make sure these roles are assigned at the container level or the storage account level.
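If you'd rather confirm the assignments programmatically than click through the portal, here is a rough sketch using the Azure SDK for Python (azure-identity and azure-mgmt-authorization); the subscription, resource group, and account names are placeholders you would substitute:

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

credential = DefaultAzureCredential()
auth_client = AuthorizationManagementClient(credential, "<subscription-id>")

# Resource ID (scope) of the storage account whose role assignments we want to inspect
scope = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
)

# Print the principal and role definition for each assignment visible at this scope
for assignment in auth_client.role_assignments.list_for_scope(scope):
    print(assignment.principal_id, assignment.role_definition_id)

The role_definition_id is a GUID, so you still need to map it back to the role name (Storage Blob Data Reader / Contributor), but this quickly shows whether your principal appears at all.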
If you're using a managed identity or service principal, verify the configuration of the storage credential in Azure Databricks. For example, a mount using a service principal's client secret looks like this:
dbutils.fs.mount(
    source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs={
        # OAuth 2.0 client-credentials flow for the service principal
        "fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net": "OAuth",
        "fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net": "<client-id>",
        "fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net": "<client-secret>",
        "fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
    }
)
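Rather than pasting the client secret into the notebook, it's usually better to read it from a Databricks secret scope. A minimal sketch, assuming a secret scope <scope-name> with a key <client-secret-key> already exists:

# Pull the service principal secret from a Databricks secret scope
client_secret = dbutils.secrets.get(scope="<scope-name>", key="<client-secret-key>")

# After mounting, confirm the mount is actually registered
display(dbutils.fs.mounts())

You would then pass client_secret in place of the literal <client-secret> value in the extra_configs above.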
Try listing the container's contents using PySpark or the Databricks CLI to verify connectivity:

# Quick sanity check that bypasses OAuth and uses the storage account key instead
spark.conf.set(
    "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
    "<storage-account-key>"
)

# List files in the container
display(dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/"))
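If the mount from the earlier step succeeded, it's also worth listing through the mount point, since that exercises the OAuth credential rather than the account key:

# Listing via the mount point confirms the service principal / managed identity path works
display(dbutils.fs.ls("/mnt/<mount-name>"))

If the account-key listing works but the mount-point listing fails, the problem is almost certainly the RBAC assignment or the OAuth configuration rather than network connectivity.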
For reference: https://learn.microsoft.com/en-us/azure/databricks/release-notes/