Azure Databricks does not recognize an HNS-enabled storage account correctly

vivek 0 Reputation points
2024-11-28T17:19:01.46+00:00

Hi,

I am trying to create an external location in Azure Databricks. When I point it to a container in a storage account that has HNS (hierarchical namespace) enabled, it still says that HNS is disabled. See the image below. The storage credential being used has the 'Storage Blob Data Reader' and 'Storage Blob Data Contributor' roles on the same storage account. I have tried multiple storage accounts that have HNS enabled but still get the same message.
[screenshot attached]


1 answer

  1. Amira Bedhiafi 27,051 Reputation points
    2024-12-01T18:32:29.3266667+00:00

    I think you may have a problem with how the storage account and credentials are configured.

    As a first step, check that the storage account you're using actually has HNS explicitly enabled (a programmatic check is sketched after this list):

    • Go to the Azure Portal.
    • Navigate to the storage account.
    • Under Configuration, verify that Hierarchical namespace is set to Enabled.
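
    If it's easier to check programmatically, here is a minimal sketch using the azure-identity and azure-mgmt-storage packages; the subscription, resource group, and account names are placeholders you'd fill in:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient

    subscription_id = "<subscription-id>"
    resource_group = "<resource-group-name>"
    account_name = "<storage-account-name>"

    # Reads the account's ARM properties; requires Reader access on the account
    client = StorageManagementClient(DefaultAzureCredential(), subscription_id)
    account = client.storage_accounts.get_properties(resource_group, account_name)

    # is_hns_enabled is True only for ADLS Gen2 (hierarchical namespace) accounts
    print(f"Hierarchical namespace enabled: {account.is_hns_enabled}")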

    You already mentioned that the Storage Blob Data Reader and Storage Blob Data Contributor roles are assigned to the principal you're using (user, service principal, or managed identity).

    Make sure these roles are assigned at the container level or at the storage account level; the sketch below is one way to confirm what is actually in place.
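
    To double-check which role assignments exist at the storage-account scope, here is a rough sketch using the azure-identity and azure-mgmt-authorization packages (the IDs and names are placeholders):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.authorization import AuthorizationManagementClient

    subscription_id = "<subscription-id>"
    scope = (
        f"/subscriptions/{subscription_id}"
        "/resourceGroups/<resource-group-name>"
        "/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"
    )

    client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

    # Lists every role assignment visible at this scope, including inherited ones
    for assignment in client.role_assignments.list_for_scope(scope):
        print(assignment.principal_id, assignment.role_definition_id)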

    If you're using a managed identity or service principal, verify the configuration of the storage credential in Azure Databricks, for example by testing an OAuth mount with the same service principal:

    dbutils.fs.mount(
        source="abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
        mount_point="/mnt/<mount-name>",
        extra_configs={
            # Authenticate to ADLS Gen2 with a service principal (OAuth client credentials)
            "fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net": "OAuth",
            "fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
            "fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net": "<client-id>",
            "fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net": "<client-secret>",
            # Microsoft Entra ID token endpoint for your tenant
            "fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
        }
    )
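
    Note that mounts are separate from Unity Catalog external locations. To exercise the external location path itself from a notebook, a minimal sketch would look like the following (the location and credential names are placeholders, and you need the CREATE EXTERNAL LOCATION privilege on the metastore):

    # Placeholder names; the storage credential must already exist in Unity Catalog
    spark.sql("""
        CREATE EXTERNAL LOCATION IF NOT EXISTS my_external_location
        URL 'abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/'
        WITH (STORAGE CREDENTIAL my_storage_credential)
    """)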
    

    Try listing the container's contents using PySpark or the Databricks CLI to verify connectivity:

    # Connectivity smoke test: authenticate with the storage account key
    # (a separate auth path from the OAuth mount above)
    spark.conf.set(
        "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
        "<storage-account-key>"
    )
    # List files in the container over the ABFS (dfs) endpoint
    display(dbutils.fs.ls("abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/"))
    

    https://learn.microsoft.com/en-us/azure/databricks/release-notes/

