Hi @Feria Nji
Thanks for the question and for using the Microsoft Q&A platform.
Yes, in Azure Data Factory (ADF) it is possible to have a scope script that calls a module whose function produces multiple output rowsets. You can use the "Stored Procedure" activity in ADF to call the scope script and specify an output dataset for each output rowset.
To push the data to Kusto in separate output streams without mixing them, create a distinct output dataset for each output rowset and reference those datasets in the "Stored Procedure" activity. You can then use the "Copy Data" activity to copy each output dataset into its corresponding Kusto table.
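To make the Kusto side concrete, here is a minimal sketch of a parameterized Azure Data Explorer (Kusto) dataset whose target table is chosen at runtime. The names here (`KustoTableDataset`, `KustoLinkedService`, `tableName`) are hypothetical placeholders, not part of your setup:

```json
{
    "name": "KustoTableDataset",
    "properties": {
        "description": "Sketch only - dataset and linked service names are placeholders.",
        "type": "AzureDataExplorerTable",
        "linkedServiceName": {
            "referenceName": "KustoLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "tableName": { "type": "string" }
        },
        "typeProperties": {
            "table": {
                "value": "@dataset().tableName",
                "type": "Expression"
            }
        }
    }
}
```

Because the table name is a dataset parameter, a single dataset definition can serve every output stream.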
To ensure that the pipeline logic does not change if an extra stream is added, have the scope script name each output stream after its output rowset, and parameterize the corresponding ADF dataset on that name. That way, when a new output stream is added, you only update the scope script to emit the new stream with its name and schema, and the pipeline logic remains the same.
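For example, a single parameterized source dataset can address all of the scope script's output streams. This is a sketch that assumes each rowset lands as a CSV blob named after the rowset in a container called `scope-output`; `RowsetBlobDataset`, `BlobStorageLinkedService`, and `rowsetName` are assumed names, not fixed ADF identifiers:

```json
{
    "name": "RowsetBlobDataset",
    "properties": {
        "description": "Sketch only - assumes one CSV blob per output rowset, named after the rowset.",
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "BlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "rowsetName": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "scope-output",
                "fileName": {
                    "value": "@concat(dataset().rowsetName, '.csv')",
                    "type": "Expression"
                }
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```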
Here are the general steps you can follow:
- Create a scope script that calls a module with a function that has multiple output rowsets.
- In the scope script, write each output rowset to its own output stream, named after the rowset, and create a matching parameterized output dataset in ADF.
- In the "Stored Procedure" activity in ADF, specify the scope script and the output datasets for each output rowset.
- Use the "Copy Data" activity to copy the data from each output dataset to the corresponding Kusto table (see the pipeline fragment after this list).
- If a new output stream is added, update the scope script to emit the new stream with the corresponding name and schema; beyond registering the new name, the pipeline logic stays the same.
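Putting the copy step together, the "Copy Data" activity can run inside a ForEach over the rowset names, so adding a stream only means appending a name to a pipeline array parameter. This fragment is a sketch built on the two hypothetical datasets above, assuming a pipeline parameter called `rowsetNames` and Kusto tables that share their rowsets' names; it is not a ready-made pipeline:

```json
{
    "name": "ForEachRowset",
    "type": "ForEach",
    "description": "Sketch only - iterates over the assumed pipeline parameter 'rowsetNames'.",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.rowsetNames",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "CopyRowsetToKusto",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "RowsetBlobDataset",
                        "type": "DatasetReference",
                        "parameters": { "rowsetName": "@item()" }
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "KustoTableDataset",
                        "type": "DatasetReference",
                        "parameters": { "tableName": "@item()" }
                    }
                ],
                "typeProperties": {
                    "source": { "type": "DelimitedTextSource" },
                    "sink": { "type": "AzureDataExplorerSink" }
                }
            }
        ]
    }
}
```

With this layout, a new output stream only requires the scope script change, one new entry in `rowsetNames`, and the target Kusto table.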
I hope this helps! Let me know if you have any further questions or if you need any additional assistance.