How to orchestrate pipelines and notebooks?

Charbel Daia Martins

What would be the best practice for scheduling pipelines created in different Azure Data Factory (ADF) instances together with notebooks created in Databricks?

for example:

Run pipeline1 which is in ADF_A -> Run pipeline1 which is in ADF_B -> Run notebook1 which is in Databricks

What would be the best tool/resource within Azure for this?



Accepted answer
    PRADEEPCHEEKATLA-MSFT, Microsoft Employee

    Hello @Charbel Daia Martins ,

    Thanks for the question and using the MS Q&A platform.

    There are multiple ways of doing it.

    • Using the Execute Pipeline activity (a control flow activity), which allows a Data Factory pipeline to invoke another pipeline. Note that the Execute Pipeline activity references pipelines within the same data factory.
    • Using activity dependencies, which define how subsequent activities depend on previous activities, determining whether to continue executing the next task. An activity can depend on one or more previous activities with different dependency conditions (such as Succeeded, Failed, Skipped, or Completed).
    • Using tumbling window trigger dependencies - You can create dependent pipelines in your ADF by adding dependencies among the tumbling window triggers in your pipelines. By creating a dependency, you guarantee that a trigger is executed only after the successful execution of the dependent trigger in your data factory.
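
    The first two options can be sketched in a single pipeline definition: an Execute Pipeline activity invokes another pipeline, and a Databricks Notebook activity depends on it with a `Succeeded` condition, so the notebook runs only after the invoked pipeline completes. This is a minimal sketch, and the names `pipeline1`, `AzureDatabricksLS`, and `/Shared/notebook1` are hypothetical placeholders:

    ```json
    {
      "name": "orchestrator",
      "properties": {
        "activities": [
          {
            "name": "RunPipeline1",
            "type": "ExecutePipeline",
            "typeProperties": {
              "pipeline": { "referenceName": "pipeline1", "type": "PipelineReference" },
              "waitOnCompletion": true
            }
          },
          {
            "name": "RunNotebook1",
            "type": "DatabricksNotebook",
            "dependsOn": [
              { "activity": "RunPipeline1", "dependencyConditions": [ "Succeeded" ] }
            ],
            "linkedServiceName": { "referenceName": "AzureDatabricksLS", "type": "LinkedServiceReference" },
            "typeProperties": { "notebookPath": "/Shared/notebook1" }
          }
        ]
      }
    }
    ```

    For the third option, a tumbling window trigger can declare a dependency on another tumbling window trigger, so it fires only after the upstream window succeeds. Again a sketch, with assumed trigger names, window size, and start time:

    ```json
    {
      "name": "TriggerB",
      "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
          "frequency": "Hour",
          "interval": 1,
          "startTime": "2021-01-01T00:00:00Z",
          "dependsOn": [
            {
              "type": "TumblingWindowTriggerDependencyReference",
              "referenceTrigger": { "referenceName": "TriggerA", "type": "TriggerReference" }
            }
          ]
        },
        "pipeline": {
          "pipelineReference": { "referenceName": "pipeline1", "type": "PipelineReference" }
        }
      }
    }
    ```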

    Hope this helps. Do let us know if you have any further queries.


    Please "Accept the answer" if the information helped you. This will help us and others in the community as well.
