The error message you are encountering, `Configuration fs.azure.account.key.<accountname>.blob.core.windows.net is not available`, suggests that there may be an issue with how the configuration is being set for your Azure Data Lake Storage (ADLS) account.
Here are a few things to check:
- Storage Account Type: Ensure that you are using the correct endpoint for your storage account. If you are connecting to Azure Data Lake Storage Gen2, you should be using the `dfs.core.windows.net` endpoint instead of `blob.core.windows.net`. The correct configuration would be:

  ```python
  spark.conf.set(
      "fs.azure.account.key." + adls_storage_name + ".dfs.core.windows.net",
      adls_storage_access_key
  )
  ```

- Access Key: Double-check that the access key you are using is valid and has not expired. You can regenerate the access key from the Azure portal if needed.
- Cluster Configuration: Make sure that your Spark cluster is properly configured to access Azure storage. You may need to restart the cluster after making changes to the configuration.
- Permissions: Verify that the Databricks workspace has the necessary permissions to access the Azure storage account.
Following these steps, with the correct endpoint and valid credentials, should resolve the issue.
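As a quick sanity check, the endpoint and key configuration above can be sketched as small helpers (this is illustrative only: the function names `adls_conf_key` and `configure_adls` are assumptions, and `configure_adls` must run inside a Databricks notebook or Spark session where `spark` exists):

```python
def adls_conf_key(account_name: str) -> str:
    # Gen2 storage accounts use the dfs endpoint, not blob.core.windows.net;
    # using the blob endpoint here is a common cause of the "not available" error.
    return "fs.azure.account.key." + account_name + ".dfs.core.windows.net"


def configure_adls(spark, account_name: str, access_key: str) -> None:
    # Register the account key so abfss:// paths to this account resolve.
    # Sketch: call this from a Databricks notebook, then read with e.g.
    #   spark.read.csv("abfss://<container>@" + account_name
    #                  + ".dfs.core.windows.net/<path>")
    spark.conf.set(adls_conf_key(account_name), access_key)
```

Building the key this way makes it easy to spot a mismatched endpoint: if the path you read from uses `abfss://...dfs.core.windows.net` but the configured key names `blob.core.windows.net`, Spark will not find the credential.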