Creating a pipeline for data transfer from SQL Server to blob storage

Vikranth-2626 100 Reputation points
2024-03-28T11:01:28.26+00:00

I'm looking to create a single pipeline that can transfer data from a SQL Server database to Blob storage in Parquet format. The pipeline should iterate over a table that lists the connection string and SQL command needed for each table or Parquet file I want to create. I've seen examples of looping through tables, but none that loop through a list of SQL SELECT statements.

The table containing the SQL commands will have columns like Server, Database Name, Schema, Table Name, and SQL Command. Thanks!

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. Smaran Thoomu 9,610 Reputation points Microsoft Vendor
    2024-03-28T11:25:27.4066667+00:00

    Hi @Vikranth-2626

    Thank you for posting your query on the Microsoft Q&A platform.

    To achieve this, you should use a parameterized linked service for SQL Server, a parameterized dataset for the SQL source, and a parameterized dataset for Blob storage, and wire them into your pipeline activities. The detailed steps below cover what you need.

    Step 1: Create a linked service for the SQL database with parameters for Server Name and Database Name.
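    For illustration, here is a minimal sketch of such a linked service definition. The names, the SQL-authentication setup, and the Key Vault reference are assumptions for the example, not required values:

    ```json
    {
        "name": "LS_SqlServer_Dynamic",
        "properties": {
            "type": "SqlServer",
            "parameters": {
                "ServerName": { "type": "String" },
                "DatabaseName": { "type": "String" }
            },
            "typeProperties": {
                "connectionString": "Server=@{linkedService().ServerName};Database=@{linkedService().DatabaseName};User ID=sqluser;",
                "password": {
                    "type": "AzureKeyVaultSecret",
                    "store": { "referenceName": "LS_KeyVault", "type": "LinkedServiceReference" },
                    "secretName": "sql-password"
                }
            }
        }
    }
    ```

    The @{linkedService().ParameterName} expressions are resolved at runtime from whatever values the calling dataset passes in.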

    Step 2: Create a dataset on top of the Step 1 linked service, with parameters for Schema and Table Name.
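    A sketch of the matching dataset, again with illustrative names. It declares its own parameters and forwards the server and database values down to the linked service:

    ```json
    {
        "name": "DS_SqlServer_Dynamic",
        "properties": {
            "type": "SqlServerTable",
            "linkedServiceName": {
                "referenceName": "LS_SqlServer_Dynamic",
                "type": "LinkedServiceReference",
                "parameters": {
                    "ServerName": "@dataset().ServerName",
                    "DatabaseName": "@dataset().DatabaseName"
                }
            },
            "parameters": {
                "ServerName": { "type": "String" },
                "DatabaseName": { "type": "String" },
                "SchemaName": { "type": "String" },
                "TableName": { "type": "String" }
            },
            "typeProperties": {
                "schema": "@dataset().SchemaName",
                "table": "@dataset().TableName"
            }
        }
    }
    ```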

    Step 3: Use a Lookup activity (or a Script activity) to read the rows of your metadata table.
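    For example, assuming a fixed dataset DS_Metadata pointing at the database that holds the metadata table, and illustrative column names matching yours:

    ```json
    {
        "name": "LookupMetadata",
        "type": "Lookup",
        "typeProperties": {
            "source": {
                "type": "SqlServerSource",
                "sqlReaderQuery": "SELECT ServerName, DatabaseName, SchemaName, TableName, SqlCommand FROM dbo.PipelineMetadata"
            },
            "dataset": { "referenceName": "DS_Metadata", "type": "DatasetReference" },
            "firstRowOnly": false
        }
    }
    ```

    Note that firstRowOnly must be false so all rows are returned, and the Lookup activity caps its output at 5,000 rows / 4 MB, beyond which you would need to partition the metadata query.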

    Step 4: Use a ForEach activity to loop through each item in the Lookup activity's output.
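    The ForEach items expression points at the Lookup output, roughly like this (the Copy activity from Steps 5-7 goes inside the activities array):

    ```json
    {
        "name": "ForEachTable",
        "type": "ForEach",
        "dependsOn": [
            { "activity": "LookupMetadata", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "items": {
                "value": "@activity('LookupMetadata').output.value",
                "type": "Expression"
            },
            "activities": []
        }
    }
    ```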

    Step 5: Inside the ForEach activity, add a Copy activity (configured in Steps 6 and 7 below).

    Step 6: In the Copy activity, under the source tab, use the Step 2 dataset and select the Query option, then pass in the SQL command from the current item. Also pass the Server, Database, Schema, and Table names to the dataset parameters, as in the sketch below.
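    Putting Steps 5 and 6 together, the Copy activity inside the ForEach could look roughly like this; the dataset names are the illustrative ones assumed above, and the sink dataset is sketched under Step 7:

    ```json
    {
        "name": "CopyToParquet",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "DS_SqlServer_Dynamic",
                "type": "DatasetReference",
                "parameters": {
                    "ServerName": "@item().ServerName",
                    "DatabaseName": "@item().DatabaseName",
                    "SchemaName": "@item().SchemaName",
                    "TableName": "@item().TableName"
                }
            }
        ],
        "outputs": [
            {
                "referenceName": "DS_Blob_Parquet",
                "type": "DatasetReference",
                "parameters": {
                    "FileName": "@concat(item().TableName, '.parquet')"
                }
            }
        ],
        "typeProperties": {
            "source": {
                "type": "SqlServerSource",
                "sqlReaderQuery": {
                    "value": "@item().SqlCommand",
                    "type": "Expression"
                }
            },
            "sink": {
                "type": "ParquetSink",
                "storeSettings": { "type": "AzureBlobStorageWriteSettings" },
                "formatSettings": { "type": "ParquetWriteSettings" }
            }
        }
    }
    ```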

    Step 7: In the Copy activity, under the sink tab, use a Parquet dataset created for Blob storage with a parameter for the file name, and pass the file name dynamically, for example tablename.parquet. A sketch of that dataset follows.
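    This sink dataset sketch assumes a container name and a Blob storage linked service of your own; both names here are placeholders:

    ```json
    {
        "name": "DS_Blob_Parquet",
        "properties": {
            "type": "Parquet",
            "linkedServiceName": {
                "referenceName": "LS_BlobStorage",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "FileName": { "type": "String" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "exports",
                    "fileName": "@dataset().FileName"
                },
                "compressionCodec": "snappy"
            }
        }
    }
    ```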

    Consider checking the videos below to understand parameterizing linked services and datasets:

    Parameterize Linked Services in Azure Data Factory

    Parameterize Datasets in Azure Data Factory

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, please click Accept Answer and select Yes for "Was this answer helpful".

