ADF Copy Activity from Delta Lake to SQL

Prakash14 126 Reputation points
2022-10-13T05:06:01.84+00:00

Using ADF to run a Copy activity from Azure Delta Lake (linked service) to a SQL database. Getting the error below:

ErrorCode=AzureDatabricksCommandError,Hit an error when running the command in Azure Databricks. Error details: Failure to initialize configurationInvalid configuration value detected for fs.azure.account.key
Caused by: Invalid configuration value detected for fs.azure.account.key.

Where exactly do we need to set up this configuration? I edited the cluster and added the following in the Spark config, but it did not work:

fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net <service-credential>
fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token

I also tried spark.hadoop.fs.azure.account.key.<accountname>.dfs.core.windows.net, and that did not work either.
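
For reference, when a key like this is set in the cluster's Spark config, the value has to be the actual credential rather than a placeholder. One way to avoid pasting the key in plain text (a sketch, assuming the storage account key is stored in a Databricks secret scope named <scope> under <key>) is Databricks' secret-reference syntax for Spark config properties:

spark.hadoop.fs.azure.account.key.<storage-account-name>.dfs.core.windows.net {{secrets/<scope>/<key>}}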

Azure Databricks
Azure Data Factory

1 answer

  1. Bhargava-MSFT 31,261 Reputation points Microsoft Employee Moderator
    2022-10-14T21:46:18.157+00:00

    Hello @Prakash14 ,
    Thank you for providing more details.
    You are getting this error because your Databricks cluster does not have access to Azure Data Lake Storage Gen2 / Blob Storage.

    Please follow this document to grant your Databricks cluster access:

    Access Azure Data Lake Storage Gen2 or Blob Storage using OAuth 2.0 with an Azure service principal

    # Retrieve the service principal's client secret from a Databricks secret scope
    service_credential = dbutils.secrets.get(scope="<scope>", key="<service-credential-key>")

    # Point the ABFS driver at OAuth 2.0 and the service principal's credentials
    spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
    spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net", "<application-id>")
    spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net", service_credential)
    spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net", "https://login.microsoftonline.com/<directory-id>/oauth2/token")
    

    Or, access Azure Data Lake Storage Gen2 or Blob Storage using a SAS token or a storage account access key:

    spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")  
    spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")  
    spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")  
    

    I hope this helps. Please let me know if you have any further questions.

