How To Run an Azure Data Factory Pipeline Continuously

BizLight-9871 126 Reputation points
2022-03-04T15:16:21.89+00:00

I have a requirement to incrementally copy data from one SQL table to another SQL table. The watermark (key) column is an Identity column. My boss wants me to restart the load as soon as it finishes, and, as you know, the completion time may vary. In Azure Data Factory, the trigger options are Schedule, Tumbling Window, and Custom Event. Does anyone know which option would let me run the pipeline continuously, and how to configure it?
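For context, the incremental-copy pattern the question describes — copy only rows whose Identity value is above a stored watermark, then advance the watermark — can be sketched outside ADF as well. This is a hypothetical illustration using SQLite in place of SQL Server; the table names (`src`, `dst`, `watermark`) are placeholders, not anything from the original setup.

```python
import sqlite3

# Illustrative in-memory database standing in for the source/destination tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY AUTOINCREMENT, val TEXT);
    CREATE TABLE dst (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE watermark (last_id INTEGER);
    INSERT INTO watermark VALUES (0);
    INSERT INTO src (val) VALUES ('a'), ('b'), ('c');
""")

def incremental_load(conn):
    """Copy rows with id above the stored watermark, then advance the watermark."""
    (last_id,) = conn.execute("SELECT last_id FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, val FROM src WHERE id > ?", (last_id,)
    ).fetchall()
    conn.executemany("INSERT INTO dst (id, val) VALUES (?, ?)", rows)
    if rows:
        conn.execute("UPDATE watermark SET last_id = ?", (rows[-1][0],))
    conn.commit()
    return len(rows)

print(incremental_load(conn))  # copies 3 rows
conn.execute("INSERT INTO src (val) VALUES ('d')")
print(incremental_load(conn))  # copies only the 1 new row
```

Because the watermark only ever moves forward, re-running the load immediately after it completes copies just the rows that arrived in the meantime, which is what makes a continuous restart loop safe.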

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. AnnuKumari-MSFT 34,556 Reputation points Microsoft Employee Moderator
    2022-03-04T17:14:30.853+00:00

    Hi @BizLight-9871 ,

Thank you for using the Microsoft Q&A platform and posting your query.

As per your query, it looks like you want to run a pipeline continuously: once the last activity completes, it should retrigger the pipeline again.
    For this purpose, you can add a Copy activity at the end of the pipeline that copies a dummy file into a blob folder, and a Delete activity at the start of the pipeline that deletes any file present in that same folder. Then create an event-based trigger for the pipeline with the event type 'Blob created'.
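For reference, a blob event trigger of the shape described above is defined in JSON roughly as follows. The trigger name, container path, and pipeline name here are placeholders, not values from the answer:

```json
{
    "name": "ContinuousRunTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/trigger-container/blobs/",
            "blobPathEndsWith": "done.txt",
            "events": ["Microsoft.Storage.BlobCreated"],
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "IncrementalCopyPipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```

Each run's final Copy activity drops `done.txt` into `trigger-container`, the `BlobCreated` event fires the trigger, and the Delete activity at the start of the next run clears the marker so the cycle can repeat.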

    Please refer to the attached gif:

    180243-continuouslyrunpipeline.gif

Hope this helps. Please let us know if you have any further queries.


1 additional answer

Sort by: Most helpful
  1. AaronHughes 396 Reputation points
    2022-03-04T16:29:32.157+00:00

This is a somewhat unusual ask of the system, but there are ways to do it around ADF.
    You could wrap the ADF pipeline in another orchestrator. For example, a Logic App could check a control table for the pipeline's completion status and kick the pipeline off again whenever it sees a completed (or not-running) run ID.

    There are probably better ways to replicate from the source to the destination, though. If both ends are SQL Server, there are certainly better options (e.g., replication).
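The orchestration idea above — an outer loop that watches for completion and re-triggers — can be sketched as a simple control loop. Here `get_pipeline_status` and `trigger_pipeline` are hypothetical stand-ins for whatever the wrapper (a Logic App, an Azure Function, a scheduled job) would actually call, such as a control-table query or the ADF REST API; this is not Logic Apps code.

```python
import time

# Stub status source standing in for a pipeline-run lookup; a real wrapper
# would query the control table or the ADF pipeline-runs API instead.
_statuses = iter(["InProgress", "InProgress", "Succeeded"])

def get_pipeline_status():
    return next(_statuses, "Succeeded")

runs_started = 0

def trigger_pipeline():
    """Stand-in for a 'create pipeline run' call; just counts invocations here."""
    global runs_started
    runs_started += 1

def run_continuously(max_runs, poll_seconds=0):
    """Kick off the pipeline, wait for a terminal status, then kick it off again."""
    trigger_pipeline()
    while runs_started < max_runs:
        status = get_pipeline_status()
        if status in ("Succeeded", "Failed", "Cancelled"):
            trigger_pipeline()  # previous run finished: start the next one
        time.sleep(poll_seconds)

run_continuously(max_runs=2)
print(runs_started)  # 2
```

The `max_runs` cap exists only so the sketch terminates; a production wrapper would loop indefinitely and should also decide what to do on `Failed` or `Cancelled` rather than blindly restarting.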

