ADF Debug pipeline: Use activity runtime

Veena 0 Reputation points
2024-07-11T10:27:54.8133333+00:00

What is the difference between the 'Use data flow debug session' and 'Use activity runtime' options when debugging a pipeline in Azure Data Factory?

I have a pipeline with a Lookup activity followed by a ForEach activity.

A lookup file is used to store parameters, and this file is passed to a ForEach activity.

The ForEach activity invokes another pipeline that contains a data flow. The compute size and core count are passed to it as parameters from the lookup file. The core count is defined as an integer value.
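For reference, a simplified sketch of how the outer pipeline is wired (the activity and pipeline names are placeholders, and the expressions are approximate):

```json
{
    "name": "ForEach over lookup rows",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@activity('Lookup parameters').output.value",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "Execute inner pipeline",
                "type": "ExecutePipeline",
                "typeProperties": {
                    "pipeline": {
                        "referenceName": "pl_inner_dataflow",
                        "type": "PipelineReference"
                    },
                    "waitOnCompletion": true,
                    "parameters": {
                        "p_compute_size": { "value": "@item().computeSize", "type": "Expression" },
                        "p_compute_type": { "value": "@item().computeType", "type": "Expression" },
                        "p_core_count": { "value": "@item().coreCount", "type": "Expression" }
                    }
                }
            }
        ]
    }
}
```
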

- On running the debug with the 'Use data flow debug session' option, the pipeline runs successfully. These are the compute parameter values passed as input to the data flow:

  ```json
  "compute": {
      "coreCount": "16",
      "computeType": "'General'"
  },
  ```

- Input to the inner pipeline (which contains the data flow):

  ```json
  "p_compute_size": "'custom'",
  "p_compute_type": "'General'",
  "p_core_count": "16"
  ```

However, the other option fails.
    
    
- On running the debug with the 'Use activity runtime' option, the ForEach activity fails with the error 'Failure type - User configuration issue'. The data flow fails with the error: The request failed with status code '"BadRequest"'. The parameters passed are the same:

  ```json
  "compute": {
      "coreCount": "16",
      "computeType": "'General'"
  },
  ```

- Input to the inner pipeline (which contains the data flow):

  ```json
  "p_compute_size": "'custom'",
  "p_compute_type": "'General'",
  "p_core_count": "16"
  ```
This is how it is defined in the lookup file:

```json
"computeSize": "custom",
"computeType": "General",
"coreCount": 16
```
I don't understand why one option runs but the other fails, even with the exact same parameter values.

Azure Data Factory