Error in Azure Databricks spark.conf.set

Nimesha Hansani Amarasinghe 20 Reputation points
2025-11-20T08:48:12.82+00:00

Hi everyone,

I am trying to configure a connection from Azure Databricks to Azure Data Lake Storage (ADLS). I have already provided the correct storage account name and access key, but I keep receiving the following error:

spark.conf.set(
    "fs.azure.account.key." + adls_storage_name + ".blob.core.windows.net",
    adls_storage_access_key)

[CONFIG_NOT_AVAILABLE] Configuration fs.azure.account.key.<accountname>.blob.core.windows.net is not available. SQLSTATE: 42K0I


3 answers

  1. Q&A Assist
    2025-11-20T08:48:27.38+00:00
    AI-generated content: this answer was created with AI from Q&A Assist.

    The error message you are encountering, Configuration fs.azure.account.key.<accountname>.blob.core.windows.net is not available, suggests that there may be an issue with how the configuration is being set for your Azure Data Lake Storage (ADLS) account.

    Here are a few things to check:

    1. Storage Account Type: Ensure that you are using the correct endpoint for your storage account. If you are connecting to Azure Data Lake Storage Gen2, you should be using the dfs.core.windows.net endpoint instead of blob.core.windows.net (see also the combined sketch after this list). The correct configuration would be:
         spark.conf.set(
             "fs.azure.account.key." + adls_storage_name + ".dfs.core.windows.net", adls_storage_access_key)
      
    2. Access Key: Double-check that the access key you are using is valid and has not expired. You can regenerate the access key from the Azure portal if needed.
    3. Cluster Configuration: Make sure that your Spark cluster is properly configured to access Azure storage. You may need to restart the cluster after making changes to the configuration.
    4. Permissions: Verify that the Databricks workspace has the necessary permissions to access the Azure storage account.

    If you follow these steps and ensure that you are using the correct endpoint and valid credentials, it should resolve the issue.
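
    Putting points 1 and 2 together, here is a minimal sketch of the session-level account-key pattern for ADLS Gen2 on a cluster without Unity Catalog. The storage account name, secret scope, and container name below are hypothetical placeholders, not values from the question:

         # Minimal sketch: session-scoped account-key auth for ADLS Gen2
         # (non-Unity Catalog clusters). All names are hypothetical placeholders.
         adls_storage_name = "mystorageaccount"
         adls_storage_access_key = dbutils.secrets.get(
             scope="my-scope", key="adls-key")  # avoid hard-coding keys in notebooks

         # ADLS Gen2 uses the dfs endpoint (ABFS driver), not the blob endpoint.
         spark.conf.set(
             f"fs.azure.account.key.{adls_storage_name}.dfs.core.windows.net",
             adls_storage_access_key)

         # Quick check: list the root of a (hypothetical) container.
         display(dbutils.fs.ls(
             f"abfss://mycontainer@{adls_storage_name}.dfs.core.windows.net/"))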


  2. PRADEEPCHEEKATLA 91,496 Reputation points Moderator
    2025-11-20T15:42:17.8166667+00:00

    Nimesha Hansani Amarasinghe - Thanks for the question and for using the MS Q&A platform.

    Update: This is expected behavior on Databricks workspaces where Unity Catalog is enabled by default.
    Note: On Unity Catalog-enabled clusters, setting fs.azure.account.key is blocked for security and compliance. Instead, you must use Unity Catalog storage credentials and external locations.


    This page describes how to create storage credentials in Unity Catalog to connect to Azure Data Lake Storage: https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials


    On clusters without Unity Catalog, this error typically occurs because the configuration key is incorrect for ADLS Gen2.

    Here’s the summary fix:

    For Azure Blob Storage, the correct key format is:

    spark.conf.set(
        f"fs.azure.account.key.{adls_storage_name}.blob.core.windows.net",
        adls_storage_access_key)
    

    For Azure Data Lake Storage Gen2 (ADLS) using ABFS, the key should be:

    spark.conf.set(
        f"fs.azure.account.key.{adls_storage_name}.dfs.core.windows.net",
        adls_storage_access_key)
    

    Why?

    • blob.core.windows.net → for Blob Storage
    • dfs.core.windows.net → for ADLS Gen2 (ABFS driver)

    If you are using ADLS Gen2, change blob.core.windows.net to dfs.core.windows.net.
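
    Once the dfs key is set, paths are addressed with the abfss:// scheme. As a small sketch (non-Unity Catalog clusters only; the container name and file path here are hypothetical):

    # After setting the dfs.core.windows.net key above (non-UC clusters only),
    # read through the ABFS driver; container and file path are hypothetical.
    df = spark.read.csv(
        f"abfss://mycontainer@{adls_storage_name}.dfs.core.windows.net/data/sample.csv",
        header=True, inferSchema=True)
    df.show(5)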

    For more details, refer to document: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#accountkey

    Hope this helps. Let me know if you have any further questions or need additional assistance. Also, if this answers your query, please click "Upvote" and "Accept the answer", as it might be beneficial to other community members reading this thread.


    To stay informed about the latest updates and insights on Azure Databricks, data engineering, and Data & AI innovations, follow me on LinkedIn.


  3. Swapnesh Panchal 1,370 Reputation points Microsoft External Staff Moderator
    2025-11-29T01:56:55.6166667+00:00

    Hi Nimesha Hansani Amarasinghe,
    Welcome to Microsoft Q&A, and thank you for posting your question here.
    Why are you getting [CONFIG_NOT_AVAILABLE] Configuration fs.azure.account.key.<storage_account_name>.dfs.core.windows.net is not available on Databricks?

    This is expected behavior on Unity Catalog-enabled clusters. For security and compliance, Unity Catalog blocks settings like fs.azure.account.key. These configurations are stripped, so the ABFS driver cannot find them.

    Correct approach:

    • Do not use spark.conf.set with account keys.
    • Instead, create a Unity Catalog Storage Credential backed by a managed identity (recommended) or service principal, then define an External Location pointing to your ADLS Gen2 container, as sketched below.
    • Grant permissions on the external location to catalogs, schemas, or users, and access data via tables, volumes, or managed locations.

    The legacy pattern (fs.azure.account.key) works only on non-UC clusters. For UC-enabled workspaces, the supported method is Storage Credentials + External Locations.
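
    As a minimal sketch of that flow, run from a notebook via spark.sql: this assumes a storage credential named my_storage_cred already exists (created per the docs linked below), and the external location name, container, storage account, file path, and user are all hypothetical placeholders:

    # Minimal sketch, assuming a storage credential "my_storage_cred" already
    # exists (created per the docs linked below); all other names are hypothetical.
    spark.sql("""
        CREATE EXTERNAL LOCATION IF NOT EXISTS my_ext_loc
        URL 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/'
        WITH (STORAGE CREDENTIAL my_storage_cred)
    """)

    # Grant a principal read access through the external location.
    spark.sql("GRANT READ FILES ON EXTERNAL LOCATION my_ext_loc TO `user@example.com`")

    # Reads now work without any account key in the Spark conf.
    df = spark.read.csv(
        "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/data/sample.csv",
        header=True)
    df.show(5)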

    Docs: Create a storage credential for connecting to Azure Data Lake Storage: https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/storage-credentials

