How to write an ADF pipeline that copies data with a ForEach loop, selecting the SQL to execute from a metadata table

JasonW-5564 161 Reputation points
2023-10-25T04:12:11.7133333+00:00

I want to have a single pipeline to copy data from a SQL Server database to Blob storage in Parquet format. I want the pipeline to loop through a table that lists the connection string and the SQL to execute for each table/Parquet file I want to generate. I have found several examples that show looping through a list of tables, but none that loop through a list of SQL SELECT statements to execute.

The metadata table containing the SQL will look something like this, with these columns: Server, DB_name, schema, table_name, sql command.
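For illustration, a row in that metadata table might look like the following (the server, database, and query values are made-up placeholders, and the last column is assumed to be named sql_command):

```json
[
  {
    "Server": "sqlsrv01.contoso.com",
    "DB_name": "SalesDB",
    "schema": "dbo",
    "table_name": "Orders",
    "sql_command": "SELECT OrderId, CustomerId, OrderDate, Total FROM dbo.Orders"
  }
]
```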

Any help is greatly appreciated!

Azure Data Factory

Accepted answer
  1. ShaikMaheer-MSFT 38,466 Reputation points Microsoft Employee
    2023-10-25T05:57:17.5433333+00:00

    Hi JasonW-5564,

    Thank you for posting your query on the Microsoft Q&A platform.

    You should use a parameterized linked service for SQL, a parameterized dataset on top of it, and a parameterized dataset for Blob storage, and then wire them into your pipeline activities. Please check the detailed steps below, which cover what you need to achieve your requirement.

    Step 1: Create a linked service for the SQL database with parameters in it for Server Name and DB Name.
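    A minimal sketch of that parameterized linked service as JSON, assuming SQL authentication with the password stored in Azure Key Vault; the names LS_Sql_Parameterized, LS_KeyVault, etl_user, and sql-etl-password are placeholders (JSON cannot carry comments, so the assumptions are called out here instead):

```json
{
  "name": "LS_Sql_Parameterized",
  "properties": {
    "type": "SqlServer",
    "parameters": {
      "ServerName": { "type": "String" },
      "DBName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Data Source=@{linkedService().ServerName};Initial Catalog=@{linkedService().DBName};User ID=etl_user;",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": { "referenceName": "LS_KeyVault", "type": "LinkedServiceReference" },
        "secretName": "sql-etl-password"
      }
    }
  }
}
```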

    Step 2: Create a dataset using the Step 1 linked service, with parameters in it for Schema and Table Name (it also needs parameters to forward the Server Name and DB Name down to the linked service).
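    A sketch of that dataset definition, again with placeholder names, forwarding the server and database down to the linked service parameters:

```json
{
  "name": "DS_Sql_Parameterized",
  "properties": {
    "type": "SqlServerTable",
    "linkedServiceName": {
      "referenceName": "LS_Sql_Parameterized",
      "type": "LinkedServiceReference",
      "parameters": {
        "ServerName": "@dataset().ServerName",
        "DBName": "@dataset().DBName"
      }
    },
    "parameters": {
      "ServerName": { "type": "String" },
      "DBName": { "type": "String" },
      "SchemaName": { "type": "String" },
      "TableName": { "type": "String" }
    },
    "typeProperties": {
      "schema": { "value": "@dataset().SchemaName", "type": "Expression" },
      "table": { "value": "@dataset().TableName", "type": "Expression" }
    }
  }
}
```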

    Step 3: Use a Lookup activity or a Script activity to get the rows from the metadata table.
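    For example, a Lookup activity reading the metadata table could look like the sketch below, reusing the Step 2 dataset with fixed parameter values for wherever the metadata table lives (all values are placeholders). Note that firstRowOnly must be false, and that the Lookup activity has a documented output cap (around 5,000 rows / 4 MB), which matters if the metadata table grows large:

```json
{
  "name": "LookupMetadata",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "SqlServerSource",
      "sqlReaderQuery": "SELECT Server, DB_name, [schema], table_name, sql_command FROM dbo.CopyMetadata"
    },
    "dataset": {
      "referenceName": "DS_Sql_Parameterized",
      "type": "DatasetReference",
      "parameters": {
        "ServerName": "metadata-server.contoso.com",
        "DBName": "ETLControl",
        "SchemaName": "dbo",
        "TableName": "CopyMetadata"
      }
    },
    "firstRowOnly": false
  }
}
```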

    Step 4: Use a ForEach activity to loop through each item in the Lookup activity output.
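    The ForEach items expression points at the value array of the Lookup output, for example:

```json
{
  "name": "ForEachMetadataRow",
  "type": "ForEach",
  "dependsOn": [
    { "activity": "LookupMetadata", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('LookupMetadata').output.value",
      "type": "Expression"
    },
    "activities": [ ]
  }
}
```

    The Copy activity sketched after Step 7 goes inside the activities array.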

    Step 5: Inside the ForEach activity, use a Copy activity.

    Step 6: In the Copy activity, under the Source tab, use the Step 2 dataset and select the Query option, then pass the SQL command from the current item to run there. Also pass the server name, DB name, schema, and table name from the current item to the dataset parameters (see the sketch after Step 7).

    Step 7: In the Copy activity, under the Sink tab, use a Parquet dataset created for Blob storage with a parameter for the file name, and pass the file name dynamically, for example tablename.parquet or something like that.
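    Putting Steps 5 through 7 together, a Copy activity inside the ForEach could look like the sketch below. It assumes a Parquet dataset on Blob storage named DS_Blob_Parquet that exposes a FileName parameter used in its file path; inside the ForEach, item() refers to the current metadata row:

```json
{
  "name": "CopyToParquet",
  "type": "Copy",
  "inputs": [
    {
      "referenceName": "DS_Sql_Parameterized",
      "type": "DatasetReference",
      "parameters": {
        "ServerName": "@item().Server",
        "DBName": "@item().DB_name",
        "SchemaName": "@item().schema",
        "TableName": "@item().table_name"
      }
    }
  ],
  "outputs": [
    {
      "referenceName": "DS_Blob_Parquet",
      "type": "DatasetReference",
      "parameters": {
        "FileName": "@concat(item().table_name, '.parquet')"
      }
    }
  ],
  "typeProperties": {
    "source": {
      "type": "SqlServerSource",
      "sqlReaderQuery": { "value": "@item().sql_command", "type": "Expression" }
    },
    "sink": {
      "type": "ParquetSink",
      "storeSettings": { "type": "AzureBlobStorageWriteSettings" },
      "formatSettings": { "type": "ParquetWriteSettings" }
    }
  }
}
```

    Because the source query comes from @item().sql_command, each iteration runs whatever SELECT statement the metadata row carries, which is exactly the loop-over-SQL-statements behavior asked about.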

    Consider checking the videos below to understand parameterizing linked services and datasets.

    Parameterize Linked Services in Azure Data Factory

    Parameterize Datasets in Azure Data Factory

    Hope this helps. Please let me know if you have any further queries.


    Please consider hitting the Accept Answer button. Accepted answers help the community as well.

    1 person found this answer helpful.
