ADF copy activity from ADLS to Delta Lake fails due to invalid configuration value detected

Aswathi Kallinkeel 0 Reputation points
2025-04-30T11:56:25.1533333+00:00

Hi, I have a copy activity in ADF to copy data from ADLS to Delta Lake. I created the linked service with the authentication method set to Access token, and the test connection for the linked service succeeds. But when I execute my pipeline, I get the error message shown in the screenshot.

We have mount points created on the Databricks cluster to read and write data from/to ADLS.

Kindly help to resolve this issue.


1 answer

  1. Smaran Thoomu 23,260 Reputation points Microsoft External Staff Moderator
    2025-04-30T13:28:24.3166667+00:00

    Hi @Aswathi Kallinkeel
    Based on the error message you're encountering:

    "Invalid configuration value detected for fs.azure.account.key" usually indicates that the Databricks mount point is attempting to authenticate with a storage account key, but the configuration provided is missing, incorrect, or conflicting with the linked service's authentication method.

    From your description, you are using:

    • ADF with Access Token authentication for the linked service.
    • Databricks mount points to ADLS, likely configured using fs.azure.account.key.

    Root Cause

    Databricks mount points created using fs.azure.account.key rely on storage account keys, but your ADF linked service is configured with Access Token (OAuth). When ADF tries to interact with the mounted directory via Databricks, it does not have access to the expected key-based configuration, leading to the error.
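
    To confirm the mismatch from a notebook, you can list the existing mount points and check whether a key-based config is visible to the Spark session. This is a minimal diagnostic sketch, assuming it runs in a Databricks notebook (where spark and dbutils are predefined); the storage account name is a placeholder, and dbutils.fs.mounts() shows only mount points and sources, not their auth settings:

    # List existing mount points to find the one the pipeline writes to.
    for m in dbutils.fs.mounts():
        print(m.mountPoint, "->", m.source)

    # Check whether an account-key config is set for the current session
    # (placeholder storage account name; returns None if unset).
    key = spark.conf.get(
        "fs.azure.account.key.<storageaccount>.dfs.core.windows.net", None)
    print("Account key configured:", key is not None)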

    Option 1: Use OAuth-based mount configuration in Databricks

    If your organization prefers token-based authentication, consider re-mounting the ADLS location in Databricks using Azure Active Directory passthrough or service principal (OAuth), which aligns with ADF’s Access Token method.

    Sample (OAuth) mount code:

    # OAuth (service principal) settings for ABFS; replace the placeholders
    # with your app registration and tenant values.
    configs = {
      "fs.azure.account.auth.type": "OAuth",
      "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
      "fs.azure.account.oauth2.client.id": "<app-id>",
      # Prefer dbutils.secrets.get(scope="<scope>", key="<key>") over a literal secret.
      "fs.azure.account.oauth2.client.secret": "<secret>",
      "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
    }

    # Remount the ADLS container using the OAuth configs above.
    dbutils.fs.mount(
      source = "abfss://<container>@<storageaccount>.dfs.core.windows.net/",
      mount_point = "/mnt/<mount-name>",
      extra_configs = configs)
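
    After remounting, a quick sanity check from a notebook (a sketch; the mount name and Delta path are placeholders):

    # Read the Delta table through the remounted path (placeholder path).
    df = spark.read.format("delta").load("/mnt/<mount-name>/<delta-table-path>")
    df.show(5)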
    

    Option 2: Update the mount point to use the correct storage account key

    If you're committed to using storage account keys, ensure:

    • The correct fs.azure.account.key.<storageaccount>.dfs.core.windows.net is configured on the cluster or passed in as part of the Spark configuration (see the sketch below).
    • The key used is valid and active.
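
    A minimal sketch for setting the key at session level from a notebook, assuming the key is stored in a Databricks secret scope (the scope and key names are placeholders):

    # Set the account key for the current Spark session (placeholder names).
    # Pulling the key from a secret scope avoids hard-coding it in the notebook.
    spark.conf.set(
      "fs.azure.account.key.<storageaccount>.dfs.core.windows.net",
      dbutils.secrets.get(scope="<scope-name>", key="<key-name>"))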

    Note that a successful test connection in ADF only confirms the linked service setup, not downstream Spark mount access. ADF does not manage Databricks cluster mount credentials; those are controlled within the Databricks environment.

    For more information on the error, please refer to: https://medium.com/@kyle.hale/troubleshooting-invalid-configuration-value-detected-for-fs-azure-account-key-6c6fcc67b217

    I hope this information helps. Please do let us know if you have any further queries.


    Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.

