Is there any way in ADF where we can have a sequential dependency upon the items within a forEach loop?

Swapnil Sarkar 0 Reputation points
2024-08-06T13:25:45.0633333+00:00

We want to create a pipeline where we want to have a sequential dependency on the items within a forEach loop.

Let's say we have 10 tables to iterate within a forEach loop, and each table depends on the previous successful run of its parent table. So is there any way we can achieve this?

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

2 answers

Sort by: Most helpful
  1. Nandan Hegde 36,151 Reputation points MVP Volunteer Moderator
    2024-08-06T15:42:35.55+00:00

Unfortunately, there is no direct built-in way to achieve that: an ADF ForEach activity continues to run all of its iterations even if a previous iteration has failed.

So you would need custom logic: create a pipeline variable and update it at the end of each iteration.

The flow would be as follows:

• The variable's default value is "success".

• At the start of each ForEach iteration, an If Condition activity checks whether the variable still equals "success".

• If true, proceed with the Copy activity and continue to the next iteration.

• If the Copy activity fails, update the variable to "failed", so that all subsequent iterations skip their work.
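A minimal sketch of this pattern as ADF pipeline JSON (activity and variable names such as `loopState` and `CopyTable` are illustrative placeholders, and the Copy activity's source/sink settings are omitted; note this approach requires the ForEach to be sequential, since the variable is pipeline-scoped):

```json
{
  "name": "SequentialForEachPipeline",
  "properties": {
    "variables": {
      "loopState": { "type": "String", "defaultValue": "success" }
    },
    "activities": [
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
          "isSequential": true,
          "items": { "value": "@pipeline().parameters.tableList", "type": "Expression" },
          "activities": [
            {
              "name": "IfPreviousSucceeded",
              "type": "IfCondition",
              "typeProperties": {
                "expression": {
                  "value": "@equals(variables('loopState'), 'success')",
                  "type": "Expression"
                },
                "ifTrueActivities": [
                  {
                    "name": "CopyTable",
                    "type": "Copy",
                    "typeProperties": { }
                  },
                  {
                    "name": "MarkFailed",
                    "type": "SetVariable",
                    "dependsOn": [
                      { "activity": "CopyTable", "dependencyConditions": [ "Failed" ] }
                    ],
                    "typeProperties": { "variableName": "loopState", "value": "failed" }
                  }
                ]
              }
            }
          ]
        }
      }
    ]
  }
}
```

When `CopyTable` fails, `MarkFailed` flips the variable, and every later iteration's If Condition evaluates to false, so the remaining copies are skipped.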


  2. phemanth 15,765 Reputation points Microsoft External Staff Moderator
    2024-08-06T15:43:49.9733333+00:00

    @Swapnil Sarkar

    Thanks for using MS Q&A platform and posting your query.

To achieve sequential dependency within a forEach loop in Azure Data Factory (ADF), here's how you can do it:

Set the isSequential Property: When configuring your forEach activity, set the isSequential property to true. This ensures that the items within the loop are processed one after the other, rather than in parallel.

    Use a Nested Pipeline: Create a nested pipeline that handles the processing of each table. This nested pipeline can include activities that depend on the successful completion of the previous activity.

    Pass Parameters: Pass the necessary parameters from the outer pipeline to the nested pipeline. This way, each iteration can process the specific table it is responsible for.

    Here’s a basic example of how you can set this up:

    Outer Pipeline:

    • Use a Lookup activity to get the list of tables.
    • Use a forEach activity to iterate over the list of tables.
    • Set the isSequential property of the forEach activity to true.
• Inside the forEach activity, use an Execute Pipeline activity to call the nested pipeline.
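The outer pipeline described above could look roughly like this in JSON (the dataset name `TableListDataset`, the child pipeline name `ProcessTable`, and the `tableName` parameter are placeholders, not from the original answer):

```json
{
  "name": "OuterPipeline",
  "properties": {
    "activities": [
      {
        "name": "GetTableList",
        "type": "Lookup",
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "dataset": { "referenceName": "TableListDataset", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "GetTableList", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "isSequential": true,
          "items": { "value": "@activity('GetTableList').output.value", "type": "Expression" },
          "activities": [
            {
              "name": "RunChildPipeline",
              "type": "ExecutePipeline",
              "typeProperties": {
                "pipeline": { "referenceName": "ProcessTable", "type": "PipelineReference" },
                "parameters": { "tableName": "@item().tableName" },
                "waitOnCompletion": true
              }
            }
          ]
        }
      }
    ]
  }
}
```

Setting `waitOnCompletion` to true makes each Execute Pipeline activity block until the child pipeline finishes, so combined with `isSequential`, each table is processed only after the previous one completes.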

For more details:

    https://www.mssqltips.com/sqlservertip/6187/azure-data-factory-foreach-activity-example/

    https://stackoverflow.com/questions/63316284/how-to-create-iteration-scoped-variables-inside-foreach-activities-in-azure-dat

    To ensure that each table depends on the successful run of its parent table within a forEach loop in Azure Data Factory (ADF), you can follow these steps:

    Set Sequential Execution:

    • In the forEach activity, go to the Settings tab.
    • Check the Sequential checkbox to ensure that the items are processed one after the other.

    Use Success Conditions:

    • Inside the forEach loop, use activities like Execute Pipeline or Copy Data.
    • Set the Success condition on each activity to ensure that the next activity only runs if the previous one succeeds.
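As a sketch, a success dependency between two activities is expressed in the pipeline JSON via the dependsOn property (activity names here are illustrative, and the Copy settings are omitted):

```json
{
  "name": "CopyChildTable",
  "type": "Copy",
  "dependsOn": [
    { "activity": "CopyParentTable", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": { }
}
```

With this condition, if CopyParentTable fails, CopyChildTable is skipped rather than executed.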

    Custom Dependency Logic:

    • If you need more control, you can use a combination of If Condition and Set Variable activities.
    • For each table, check the success of the previous table’s execution and proceed accordingly.

Hope this helps. Do let us know if you have any further queries.


If this answers your query, do click Accept Answer and Yes for "Was this answer helpful?".

