Hi @Aswathi Kallinkeel
The error message you're encountering, `Invalid configuration value detected for fs.azure.account.key`, usually indicates that the Databricks mount point is attempting to authenticate using a storage account key, but the configuration provided is either missing, incorrect, or conflicting with the linked service's authentication method.
From your description, you are using:
- ADF with Access Token authentication for the linked service.
- Databricks mount points to ADLS, likely configured using `fs.azure.account.key`.
Root Cause
Databricks mount points created with `fs.azure.account.key` rely on storage account keys, but your ADF linked service is configured with an Access Token (OAuth). When ADF triggers a job against the mounted directory, the Databricks cluster does not have the expected key-based configuration, which leads to the error.
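Before changing anything, it can help to confirm how the existing mount was created. A quick check using the standard Databricks utilities (the mount name below is a placeholder):

```python
# List existing mounts to see which source URI backs the failing mount point.
# dbutils.fs.mounts() shows the mount point and source, though not the auth
# configs, so you may also need to check the notebook that created the mount.
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)

# To switch an existing key-based mount to OAuth (Option 1 below),
# unmount it first. "/mnt/<mount-name>" is a placeholder.
dbutils.fs.unmount("/mnt/<mount-name>")
```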
Option 1: Use OAuth-based mount configuration in Databricks
If your organization prefers token-based authentication, consider re-mounting the ADLS location in Databricks using Azure Active Directory credential passthrough or a service principal (OAuth), which aligns with ADF's Access Token method.
Sample (OAuth) mount code:

```python
# Service principal (OAuth 2.0 client credentials) configuration for ABFS.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<app-id>",
    "fs.azure.account.oauth2.client.secret": "<secret>",  # prefer a secret scope over a hard-coded value
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the ADLS Gen2 container using the service principal credentials.
dbutils.fs.mount(
    source="abfss://<container>@<storageaccount>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
```
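As a follow-up, rather than hard-coding the client secret, it is safer to pull it from a Databricks secret scope before building `configs`. A minimal sketch, assuming a scope and key you would create yourself (the names are placeholders, not from your setup):

```python
# Hypothetical scope/key names; create them via the Databricks CLI or API first.
client_secret = dbutils.secrets.get(scope="<scope-name>", key="<secret-key-name>")
configs["fs.azure.account.oauth2.client.secret"] = client_secret
```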
Option 2: Update the mount point to use the correct storage account key
If you're committed to using storage account keys, ensure:
- The correct `fs.azure.account.key.<storageaccount>.dfs.core.windows.net` value is configured on the cluster or passed in as part of the Spark configuration (see the sketch after this list).
- The key used is valid and active.
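A minimal sketch of the key-based mount, assuming the account key is stored in a secret scope (all angle-bracket names are placeholders):

```python
# Hypothetical secret scope/key; store the storage account key there, not inline.
account_key = dbutils.secrets.get(scope="<scope-name>", key="<storage-key-name>")

# Re-mount with key-based auth; the account-key setting goes in extra_configs.
dbutils.fs.mount(
    source="abfss://<container>@<storageaccount>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs={
        "fs.azure.account.key.<storageaccount>.dfs.core.windows.net": account_key
    },
)
```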
A successful test connection in ADF only confirms the linked service setup, not downstream Spark mount access. ADF does not manage Databricks cluster mount credentials; those are controlled within the Databricks environment.
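To isolate where the failure actually sits, you can test the mount from a Databricks notebook directly, outside of ADF (the mount name is a placeholder):

```python
# If this also fails inside Databricks, the problem is the mount's credentials,
# not the ADF linked service.
dbutils.fs.ls("/mnt/<mount-name>")
```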
For more information on this error, please refer to this article: https://medium.com/@kyle.hale/troubleshooting-invalid-configuration-value-detected-for-fs-azure-account-key-6c6fcc67b217
I hope this information helps. Please do let us know if you have any further queries.
Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.