Azure Data Factory User Configuration Issues for Empty URI

Karthik Elavan 1 Reputation point
2023-03-17T10:24:07.7933333+00:00

I am developing ADF pipelines that call an Azure Databricks notebook. In the notebook, I pass an API URL that our PySpark code uses. It works fine in debug mode, but after I move it to the DEV/main branch, runs started with Trigger Now or a scheduled trigger fail, and I am not able to get a proper error message.

We are getting a user configuration issue with error code 2011: the URI is invalid because the URI is empty.

Kindly find the attached screenshots and help me with the same. I am passing pipeline parameters into the Databricks notebook activity's base parameters to pass the values into the notebook. This is what I am doing now.
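Roughly, the notebook activity is defined like this (a sketch; the parameter name, path, and linked service name are placeholders, not our real values):

```json
{
  "name": "RunPysparkNotebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLS",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/Shared/process_api_data",
    "baseParameters": {
      "api_url": {
        "value": "@pipeline().parameters.apiUrl",
        "type": "Expression"
      }
    }
  }
}
```

Inside the notebook, the value is then read with dbutils.widgets.get("api_url").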

[Screenshots attached: issues1, issue2, iss3]

Tags: Azure Databricks, Azure Data Factory

3 answers

  1. MartinJaffer-MSFT 26,011 Reputation points
    2023-03-20T18:10:02.2033333+00:00

    @Karthik Elavan Hello and welcome to Microsoft Q&A.

    As I understand it, you have a pipeline in the DEV environment. This pipeline uses a parameter for the name of the notebook in the Azure Databricks Notebook activity. In DEV this works well in DEBUG. In DEV, does it also work as Trigger Now or as a scheduled trigger?

    I understand that after moving from DEV to PROD, the scheduled trigger run reports the error "URI is empty". Does DEBUG work in PROD?

    I am trying to separate two issues: working in DEBUG vs. triggered, and working in DEV vs. PROD.

    "The URI is empty" suggests the notebook name is empty, which means the value is not being passed in. Since the trigger supplies the value to the pipeline, we should look at the trigger definition.

    Find the trigger definition under "Manage" > "Triggers", then open the {} (code) view.


    In the trigger definition, look for "pipelines". This is the list of pipelines the trigger is used by. Find your pipeline by its "referenceName", then check its "parameters", which lists the parameter names and values. I suspect this is where the problem lies.


    If you cannot find the parameter, or the value is blank, this is the cause of the problem.
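    For reference, a trigger whose parameter is wired up correctly looks roughly like this (the trigger name, pipeline name, parameter name, and URL are illustrative):

    ```json
    {
      "name": "DailyTrigger",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "startTime": "2023-03-20T00:00:00Z",
            "timeZone": "UTC"
          }
        },
        "pipelines": [
          {
            "pipelineReference": {
              "referenceName": "MyDatabricksPipeline",
              "type": "PipelineReference"
            },
            "parameters": {
              "apiUrl": "https://api.example.com/v1/data"
            }
          }
        ]
      }
    }
    ```

    If the "parameters" block is missing, or "apiUrl" is blank, the pipeline receives an empty value on every triggered run, while DEBUG runs (which prompt you for parameter values) still work.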

    You can also get this information from the pipeline authoring screen via "Trigger" > "New/Edit" > your trigger > the trigger run parameters page.


    Let me know if this helps or not. There could be another cause, which is why I wanted to separate the two problems. Is the parameter being overwritten or erased when you go from DEV to PROD?


  2. Karthik Elavan 1 Reputation point
    2023-03-20T18:42:12.67+00:00

    It is working fine in debug mode. We are passing some parameters via the Spark cluster configuration in the linked service.
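    For context, the relevant part of our linked service looks roughly like this (a sketch; the domain, secret, cluster settings, and Spark config key are placeholders):

    ```json
    {
      "name": "AzureDatabricksLS",
      "properties": {
        "type": "AzureDatabricks",
        "typeProperties": {
          "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
          "accessToken": {
            "type": "AzureKeyVaultSecret",
            "store": {
              "referenceName": "MyKeyVaultLS",
              "type": "LinkedServiceReference"
            },
            "secretName": "databricks-token"
          },
          "newClusterVersion": "11.3.x-scala2.12",
          "newClusterNodeType": "Standard_DS3_v2",
          "newClusterNumOfWorker": "2",
          "newClusterSparkConf": {
            "spark.myapp.api_url": "https://api.example.com/v1/data"
          }
        }
      }
    }
    ```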


  3. Oliver Wolffe 0 Reputation points
    2023-04-26T23:27:20.09+00:00

    I am having a similar issue at the moment. I am in the process of investigating a fix.

    I think this might be related to a deployment parameter. When you publish to higher environments, if you don't specify that the cluster or Databricks workspace ID should inherit the default value specified in your code, it will be null, hence the URI comes through as empty.

    When you add new parameters to your ARM template parameters definition JSON (arm-template-parameters-definition.json), make sure to use "=" instead of "-". The equals sign persists the current value as the parameter's default unless you override it during deployment.
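    For example, something like this in arm-template-parameters-definition.json keeps the Databricks domain and cluster ID as parameters whose defaults come from the values in your code (the property names here assume a Databricks linked service):

    ```json
    {
      "Microsoft.DataFactory/factories/linkedServices": {
        "*": {
          "properties": {
            "typeProperties": {
              "domain": "=",
              "existingClusterId": "="
            }
          }
        }
      }
    }
    ```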

    Or just set this value in the parameter file you pass in with the template during deployment.
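    That would look roughly like this in the parameter file (the generated parameter names follow the pattern <resource>_properties_<path>, so yours will differ):

    ```json
    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "factoryName": {
          "value": "my-prod-factory"
        },
        "AzureDatabricksLS_properties_typeProperties_domain": {
          "value": "https://adb-1234567890123456.7.azuredatabricks.net"
        }
      }
    }
    ```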