How to handle multiple different output streams in ADF

Feria Nji 40 Reputation points Microsoft Intern
2024-06-26T22:25:06.4633333+00:00

Hi team,
I have a scope script that calls a module with a function that has multiple output rowsets, and I was wondering whether this is supported in ADF. Ultimately, we want to push the data to Kusto as separate output streams without mixing them.

We also want to ensure that, if changes need to be made, only the script is touched and not the pipeline; ultimately, the pipeline logic should not change if an extra stream is added. If there is a solution or workaround for this, please let me know.

Thanks,
Feria

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. Smaran Thoomu 16,555 Reputation points Microsoft Vendor
    2024-06-27T11:00:53.0066667+00:00

    Hi @Feria Nji

    Thanks for the question and for using the MS Q&A platform.
    Yes, it is possible to use a scope script that calls a module with a function that has multiple output rowsets in Azure Data Factory (ADF). You can use the "Stored Procedure" activity in ADF to call the scope script and specify an output dataset for each output rowset.

    To push the data to Kusto in different output streams without mixing them, create a separate output dataset for each output rowset and specify them in the "Stored Procedure" activity. You can then use the "Copy Data" activity to copy the data from each output dataset to the corresponding Kusto table.
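
    For the Kusto side specifically, each rowset can be landed in its own table with its own ingestion call, which keeps the streams from mixing. Below is a minimal sketch using the azure-kusto-ingest Python package; the cluster URI, database, file path, and table name are placeholders, and the exact import paths can vary slightly between package versions:

    ```python
    from azure.kusto.data import KustoConnectionStringBuilder
    from azure.kusto.data.data_format import DataFormat
    from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

    # Placeholder cluster/database names - replace with your own.
    INGEST_URI = "https://ingest-mycluster.kusto.windows.net"
    DATABASE = "MyDatabase"

    kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(INGEST_URI)
    client = QueuedIngestClient(kcsb)

    # One IngestionProperties per target table keeps each stream separate.
    props = IngestionProperties(
        database=DATABASE,
        table="ScopeErrors",          # target table for this rowset (hypothetical)
        data_format=DataFormat.CSV,   # format the script writes its output in
    )

    # Queue the file produced for this rowset; repeat per rowset/table.
    client.ingest_from_file("out/errors.csv", ingestion_properties=props)
    ```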

    To ensure that the pipeline logic does not change if an extra stream is added, you can create a dynamic output dataset in the scope script based on the output rowset name. That way, when a new output stream is added, you only update the scope script to create a new output dataset with the corresponding name and schema, and the pipeline logic remains the same.
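
    One way to make that dynamic behaviour concrete is to have the script (or a small post-processing step) emit a manifest listing every output stream it produced, together with the file it wrote and the Kusto table it should land in. The pipeline only ever reads the manifest, so adding a stream means adding one entry on the script side. This is a sketch under that assumption; the file names and field names are hypothetical:

    ```python
    import json

    # Hypothetical manifest: one entry per output rowset the script produced.
    # Adding a new rowset means appending one entry here - nothing in the
    # pipeline definition has to change.
    manifest = [
        {"stream": "Errors",  "path": "out/errors.csv",  "kustoTable": "ScopeErrors"},
        {"stream": "Metrics", "path": "out/metrics.csv", "kustoTable": "ScopeMetrics"},
    ]

    with open("out/streams_manifest.json", "w") as f:
        json.dump(manifest, f, indent=2)
    ```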

    Here are the general steps you can follow:

    • Create a scope script that calls a module with a function that has multiple output rowsets.
    • In the scope script, create a dynamic output dataset for each output rowset based on the output rowset name.
    • In the "Stored Procedure" activity in ADF, specify the scope script and the output datasets for each output rowset.
    • Use the "Copy Data" activity to copy the data from each output dataset to the corresponding Kusto table.
    • If a new output stream is added, update the scope script to create a new output dataset with the corresponding name and schema; the pipeline steps above stay the same (see the end-to-end sketch after this list).
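
    Tying the steps above together, the part that pushes data to Kusto can be a single loop driven by the manifest, so it never changes when a stream is added. This is only a sketch combining the two snippets above, with the same placeholder names:

    ```python
    import json

    from azure.kusto.data import KustoConnectionStringBuilder
    from azure.kusto.data.data_format import DataFormat
    from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

    INGEST_URI = "https://ingest-mycluster.kusto.windows.net"  # placeholder
    DATABASE = "MyDatabase"                                     # placeholder

    kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(INGEST_URI)
    client = QueuedIngestClient(kcsb)

    # The manifest produced by the script drives the loop, so the loop itself
    # stays fixed when streams are added or removed.
    with open("out/streams_manifest.json") as f:
        manifest = json.load(f)

    for entry in manifest:
        props = IngestionProperties(
            database=DATABASE,
            table=entry["kustoTable"],
            data_format=DataFormat.CSV,
        )
        client.ingest_from_file(entry["path"], ingestion_properties=props)
        print(f"Queued {entry['stream']} -> {entry['kustoTable']}")
    ```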

    I hope this helps! Let me know if you have any further questions or if you need any additional assistance.

    1 person found this answer helpful.

0 additional answers

