Azure Databricks ADF connectivity issue with Delta tables

Rish Shah


I am trying to pass a simple SELECT SQL query to query a Delta table in Databricks using a Copy activity. My sink is a SQL table.

I am able to preview the results of my query and am happy with them. However, I am getting an error when running the ADF pipeline. Please help:

Error code: 2200
Failure type: User configuration issue
Details: ErrorCode=AzureDatabricksCommandError,Hit an error when running the command in Azure Databricks. Error details: Failure to initialize configuration.


Azure Databricks
Azure Data Factory

Accepted answer
  1. Rish Shah

    My colleague Siva was able to find a solution today. I am updating this in the hope that it will help someone:

    Basically, we need to enable staging to copy data from Delta Lake. We need to update the Spark config (in Databricks, go to the cluster, click Edit, then Advanced Options; leave whatever is already in there as is, and add the line below) with the staging storage account key, as per the comment below:

    Spark Config: edit the Spark config by entering the connection information for your Azure Storage account. This will allow your cluster to access the staging files. Enter the pair <STORAGE_ACCOUNT_NAME> <ACCESS_KEY>, where <STORAGE_ACCOUNT_NAME> is your Azure Storage account name and <ACCESS_KEY> is your storage access key.

    Example: HD+91Y77b+TezEu1lh9QXXU2Va6Cjg9bu0RRpb/KtBj8lWQa6jwyA0OGTDmSNVFr8iSlkytIFONEHLdl67Fgxg==
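    The answer does not spell out the exact Spark property name. As a sketch based on the standard Databricks pattern for Blob storage with an account key (the property name below is an assumption, not taken from the original answer), the line added to the cluster's Spark config would look like:

    ```
    fs.azure.account.key.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net <ACCESS_KEY>
    ```

    For an ADLS Gen2 staging account, the endpoint suffix would be `dfs.core.windows.net` instead of `blob.core.windows.net`.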

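    On the ADF side, staged copy is enabled on the Copy activity itself. A minimal sketch of the relevant activity JSON, assuming the Delta Lake source and Azure SQL sink described in the question (the linked service name `StagingBlobStorage` and the container path are hypothetical placeholders):

    ```json
    {
        "name": "CopyDeltaToSql",
        "type": "Copy",
        "typeProperties": {
            "source": {
                "type": "AzureDatabricksDeltaLakeSource",
                "query": "SELECT * FROM my_delta_table"
            },
            "sink": { "type": "AzureSqlSink" },
            "enableStaging": true,
            "stagingSettings": {
                "linkedServiceName": {
                    "referenceName": "StagingBlobStorage",
                    "type": "LinkedServiceReference"
                },
                "path": "staging-container"
            }
        }
    }
    ```

    The staging linked service must point at the same storage account whose key was added to the cluster's Spark config above, so both ADF and Databricks can read and write the interim files.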

1 additional answer

  1. brajesh jaishwal

    Any solution to this? I am facing the same issue.