How to dynamically fetch a specific partition/folder from a Synapse Spark pool table

Ganesh Pathak 1 Reputation point


We are using a large Synapse Spark pool table that internally stores its data in Parquet files.
Those files are stored in the data lake based on the partition key columns.

We need to read this table in an ADF data flow and dynamically filter it based on values from another source system.
We could do this for a single value in the source query by using a variable (Append Variable).
However, we would then need to run a new data flow instance for every single record.
Please suggest a way to dynamically filter the source data on multiple values.


Azure Synapse Analytics
Azure Data Factory

1 answer

Sort by: Most helpful
  1. MartinJaffer-MSFT 26,061 Reputation points

    Hello @Ganesh Pathak and welcome to Microsoft Q&A.

    I am not sure why you would need a data flow run for every value. You can pass an array-type variable to the data flow. See the pictures below.

    Inside the data flow, declare a parameter of the appropriate array type:
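    A minimal sketch of what the data flow side might look like in data flow script; the parameter name, default value, source name, and partition key column are hypothetical:

    ```
    // Hypothetical data flow script fragment: declare an array parameter,
    // then filter rows whose partition key column matches any value in it.
    parameters{
        partitionKeys as string[] (['key1'])
    }
    source1 filter(in($partitionKeys, partitionKeyColumn)) ~> FilterByPartitionKeys
    ```

    The built-in `in()` expression returns true when the item is found in the array, so a single data flow run can cover every value in the list.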

    In the pipeline, create an array-type variable:

    In the pipeline's data flow activity, pass the variable:
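    On the pipeline side, the data flow activity accepts a pipeline expression for each data flow parameter, so the array variable can be passed straight through. A rough JSON sketch under assumed names (activity, data flow, parameter, and variable names are all made up, and the exact expression wrapping may differ in your ARM template):

    ```json
    {
        "name": "RunFilteredDataFlow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
            "dataFlow": {
                "referenceName": "MyDataFlow",
                "type": "DataFlowReference",
                "parameters": {
                    "partitionKeys": {
                        "value": "@variables('keyList')",
                        "type": "Expression"
                    }
                }
            }
        }
    }
    ```

    With this in place, one data flow run filters on the whole list of values instead of one run per record.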