Using Dynamic Job Parameters with Databricks Job Activity

Víctor C. Fernández 0 Reputation points
2025-06-30T08:25:18.73+00:00

The new Databricks Job activity is being adopted as a replacement for a generic Databricks job trigger pipeline, with the goal of keeping all job-triggering logic in a separate pipeline. Currently, a JSON-like string containing all the parameters required to trigger the job is passed in dynamically; this could instead be passed as an object. The intention is to define the jobParameters dynamically while replacing the existing logic that calls the Databricks jobs API endpoint directly.
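
For reference, the intent is to declare the parameter on the pipeline as an object rather than a string, along these lines (the parameter names and values below are illustrative only, not the real ones):

"parameters": {
    "dynamicJobParameters": {
        "type": "object",
        "defaultValue": {
            "input_path": "abfss://raw@storageaccount.dfs.core.windows.net/incoming",
            "run_date": "2025-06-30"
        }
    }
}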

When attempting to pass the following to the jobParameters in the activity:

"jobParameters": "@pipeline().parameters.dynamicJobParameters"

It gets destructured character by character into the following, with each index of the literal expression string becoming a key:

"jobParameters": {
                        "0": "@",
                        "1": "p",
                        "2": "i",
                        "3": "p",
                        "4": "e",
                        "5": "l",
                        "6": "i",
                        "7": "n",
                        "8": "e",
                        "9": "(",
                        "10": ")",
                        "11": ".",
                        "12": "p",
                        "13": "a",
                        "14": "r",
                        "15": "a",
                        "16": "m",
                        "17": "e",
                        "18": "t",
                        "19": "e",
                        "20": "r",
                        "21": "s",
                        "22": ".",
                        "23": "d",
                        "24": "y",
                        "25": "n",
                        "26": "a",
                        "27": "m",
                        "28": "i",
                        "29": "c",
                        "30": "J",
                        "31": "o",
                        "32": "b",
                        "33": "P",
                        "34": "a",
                        "35": "r",
                        "36": "a",
                        "37": "m",
                        "38": "e",
                        "39": "t",
                        "40": "e",
                        "41": "r",
                        "42": "s",
                        "seed": 18692363
                    }

Are there any thoughts or workarounds for this behavior? Additionally, is there any indication of whether it will be addressed once the activity is out of preview?

Thanks!


1 answer

  1. Venkat Reddy Navari 3,630 Reputation points Microsoft External Staff Moderator
    2025-06-30T11:53:26.0933333+00:00

    Hi @Víctor C. Fernández, the behavior you're seeing, where jobParameters gets broken into individual characters, usually happens when the expression is passed as a plain string instead of being evaluated. In your case, the line:

    
    "jobParameters": "@pipeline().parameters.dynamicJobParameters"
    

    is being treated literally as a plain string rather than being interpreted as an expression.

    Here’s what you can try:

    Instead of hardcoding the value as a string, open the dynamic content editor (click the fx button next to the jobParameters field) and enter this directly as an expression:

    
    @pipeline().parameters.dynamicJobParameters
    

    This ensures the value is evaluated at runtime and passed as an object rather than being split up character by character.
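
    If you author the pipeline JSON directly rather than through the editor, the same fix is expressed by serializing the dynamic content as an Expression object instead of a plain string, roughly like this (assuming the editor's usual serialization for non-string properties):

    "jobParameters": {
        "value": "@pipeline().parameters.dynamicJobParameters",
        "type": "Expression"
    }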

    Important: Make sure the dynamicJobParameters value you pass into the pipeline is already a proper JSON object. If you're passing a JSON string, parse it first using the json() function:

    
    @json(pipeline().parameters.dynamicJobParameters)
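
    For example, if the caller passes the parameter as a string whose contents look like this (values are illustrative):

    {"input_path": "abfss://raw@storageaccount.dfs.core.windows.net/incoming", "run_date": "2025-06-30"}

    json() turns that string into an object, and its top-level keys are then passed as the Databricks job parameter names.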
    
    

    About the preview status:

    Since the new Databricks Job activity is still in preview, unexpected behavior like this isn't uncommon. For now this appears to be working as designed, but it's worth keeping an eye on the Data Factory release notes and Azure updates for changes as the feature progresses toward general availability.


    Hope this helps. If this answers your query, please click Accept Answer and select Yes for "Was this answer helpful". If you have any further questions, do let us know.

