Azure Databricks does not recognize HNS-enabled storage account correctly

vivek 20 Reputation points
2024-11-28T17:19:01.46+00:00

Hi,

I am trying to create an external location in Azure Databricks. When I point it to a container in a storage account that has HNS enabled, it still says that HNS is disabled (see the attached screenshot). The storage credential being used has the Storage Blob Data Reader and Storage Blob Data Contributor roles on the same storage account. I have tried with multiple storage accounts that have HNS enabled, but I still get the same message.
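
For reference, the SQL equivalent of defining such an external location looks roughly like this (the names below are placeholders, not my actual objects):

    # A minimal sketch of creating the external location in SQL rather than the Catalog UI
    spark.sql("""
      CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
      URL 'abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/'
      WITH (STORAGE CREDENTIAL my_storage_credential)
    """)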


2 answers

  1. Amira Bedhiafi 33,631 Reputation points Volunteer Moderator
    2024-12-01T18:32:29.3266667+00:00

    I think you may have a problem with how the storage account and credentials are configured.

    As a first step, check whether the storage account you're using actually has HNS enabled (a programmatic check is sketched right after this list):

    • Go to the Azure Portal.
    • Navigate to the storage account.
    • Under Configuration, verify that Hierarchical namespace is set to Enabled.
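
    If you'd rather confirm this outside the portal, here is a minimal sketch using the Azure management SDK (it assumes the azure-identity and azure-mgmt-storage packages are installed; all names are placeholders):

    # Check the hierarchical namespace flag on the storage account programmatically
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient

    subscription_id = "<subscription-id>"
    resource_group = "<resource-group-name>"
    account_name = "<storage-account-name>"

    client = StorageManagementClient(DefaultAzureCredential(), subscription_id)
    account = client.storage_accounts.get_properties(resource_group, account_name)

    # Should print True for an ADLS Gen2 (HNS-enabled) account
    print("Hierarchical namespace enabled:", account.is_hns_enabled)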

    You already mentioned that the Storage Blob Data Reader and Storage Blob Data Contributor roles are assigned to the principal you're using (a user, service principal, or managed identity).

    Make sure these roles are assigned at the container or storage-account level; a quick way to list what is actually assigned on the account is sketched below.
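
    A minimal sketch (assuming azure-identity and azure-mgmt-authorization are installed; the names are placeholders) that lists the role assignments on the storage account scope:

    # List role assignments on the storage account so you can confirm the
    # principal really holds the two data-plane roles
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.authorization import AuthorizationManagementClient

    subscription_id = "<subscription-id>"
    scope = (
        f"/subscriptions/{subscription_id}"
        "/resourceGroups/<resource-group-name>"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
    )

    auth_client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
    for assignment in auth_client.role_assignments.list_for_scope(scope):
        print(assignment.principal_id, assignment.role_definition_id)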

    If you're using a managed identity or service principal, verify the configuration of the storage credential in Azure Databricks:

    dbutils.fs.mount(
        source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
        mount_point="/mnt/<mount-name>",
        extra_configs={
            "fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net": "OAuth",
            "fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
            "fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net": "<client-id>",
            "fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net": "<client-secret>",
            "fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
        }
    )
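
    If you only want to validate the service principal without creating a mount, the same configs can be set on the Spark session instead (a sketch using the same placeholder names):

    # Set the OAuth configs on the Spark session and read the container directly
    configs = {
        "fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net": "OAuth",
        "fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net": "<client-id>",
        "fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net": "<client-secret>",
        "fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }
    for key, value in configs.items():
        spark.conf.set(key, value)

    display(dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/"))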
    

    Try listing the container's contents using PySpark or the Databricks CLI to verify basic connectivity (note that the snippet below falls back to account-key authentication rather than the storage credential):

    # Authenticate with the account key, then list the files in the container
    spark.conf.set(
        "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
        "<storage-account-key>"
    )
    display(dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/"))
    

    https://learn.microsoft.com/en-us/azure/databricks/release-notes/


  2. Sahil Mahale 0 Reputation points
    2025-03-01T11:43:49.11+00:00

    If the issue still persists, I recommend following the steps below:
    1. In your storage account, under Access control (IAM), add a role assignment for the Databricks access connector you created, but this time go to the Privileged administrator roles tab.

    2. Under Privileged administrator roles, choose the Owner or Contributor role based on your requirement. My recommendation is Owner, since it gives the connector full access. On the next page, select your Databricks access connector via + Select members. If you prefer to script the assignment, see the sketch after these steps.
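
    If you'd rather script that assignment, here is a rough sketch with the Azure SDK (assuming azure-identity and azure-mgmt-authorization are installed; the role-definition GUID and the connector's principal ID are placeholders you would look up first):

    # Assign a role on the storage account to the access connector's managed identity
    import uuid

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.authorization import AuthorizationManagementClient
    from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

    subscription_id = "<subscription-id>"
    scope = (
        f"/subscriptions/{subscription_id}"
        "/resourceGroups/<resource-group-name>"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
    )

    auth_client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
    auth_client.role_assignments.create(
        scope=scope,
        role_assignment_name=str(uuid.uuid4()),  # each assignment needs a new GUID
        parameters=RoleAssignmentCreateParameters(
            # GUID of the built-in role you picked (Owner or Contributor) -- placeholder
            role_definition_id=(
                f"/subscriptions/{subscription_id}"
                "/providers/Microsoft.Authorization/roleDefinitions/<role-definition-guid>"
            ),
            # Object ID of the access connector's managed identity -- placeholder
            principal_id="<access-connector-principal-id>",
            principal_type="ServicePrincipal",
        ),
    )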

    I am pretty sure it will work.
    Thanks

