How to use pagination in Azure Data Factory with a REST API

Ruben Dario Reyes Monsalve

I'm trying to retrieve data from the "Pipeline Runs - Query By Factory" endpoint of the Azure REST API using the Azure Data Factory REST activity. I'm stuck on the pagination step, and I haven't found any example of this beyond the generic information in the documentation.

I have tested different configurations with no luck so far. When I use Headers.continuationToken with $.continuationToken in the pagination parameters, it apparently starts to move between pages once it detects that there are more than 100 records; but instead of retrieving the next 100 records (or the rest of them), it starts to duplicate the values of the first loaded page. For example, if there are 135 pipeline runs in the specified time window, it retrieves more than 30K rows, which makes no sense: checking the data before canceling the debug pipeline shows more than 500 duplicated rows per pipeline run.

Any ideas would be really appreciated.


Accepted answer
  1. svijay-MSFT, Microsoft Employee

    Hello @Ruben Dario Reyes Monsalve

    Welcome to the Microsoft Q&A platform.

    Currently, the Pipeline Runs - Query By Factory endpoint supports pagination through the request body.


    The continuation token is taken as part of the request body, not through the query parameters or headers.

    That is why, in the scenario above, the data (first page) kept repeating: the continuation token was not being honored, because it was being passed in the headers.

    Unfortunately, at this point in time, pagination for this endpoint (Pipeline Runs - Query By Factory) cannot be achieved via headers, query parameters, or absolute URI, which are the only supported forms of pagination for the copy activity.
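    For illustration, the continuation token belongs in the request body itself, alongside the mandatory time-window filters. A minimal sketch (all field values are placeholders, not real tokens or timestamps):

```python
import json

# Sketch of a Pipeline Runs - Query By Factory request body.
# lastUpdatedAfter / lastUpdatedBefore are mandatory; continuationToken
# is included only when requesting the second page onward.
body = {
    "lastUpdatedAfter": "2021-06-01T00:00:00.0000000Z",   # placeholder
    "lastUpdatedBefore": "2021-06-02T00:00:00.0000000Z",  # placeholder
    "continuationToken": "token-from-previous-response",  # placeholder
}

payload = json.dumps(body, indent=2)
print(payload)
```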

    Having said that - one workaround I can think of :

    Step 1: Initialize a variable ContinuationToken as blank.
    Step 2: Perform a Copy activity with the ContinuationToken value (with merge behavior).
    Step 3: Perform a Web activity.
    Step 4: Set the variable ContinuationToken to the continuationToken obtained from the Web activity.
    Step 5: Repeat steps 2-4 using the Until activity, until no continuationToken is returned.

    The end goal is to perform the copy activity iteratively, with a different request body each time, to get the next pages of data until the ContinuationToken comes back blank. The above is just for demonstration; you can optimize the logic as you see fit.
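    The steps above can be sketched in Python. Here a hypothetical `post` callable stands in for the Web/Copy activity call (in the real pipeline, the same loop is expressed with Until, Web, Copy, and Set Variable activities); the demo responder and run IDs are made up for illustration:

```python
def query_all_pipeline_runs(post, last_updated_after, last_updated_before):
    """Collect all pipeline runs by looping until no continuationToken is
    returned, passing the token back in the request body each iteration.

    `post` is any callable taking a request-body dict and returning the
    parsed JSON response ({"value": [...], "continuationToken": "..."}).
    """
    runs = []
    token = ""  # Step 1: start with a blank continuation token
    while True:
        body = {
            "lastUpdatedAfter": last_updated_after,    # mandatory
            "lastUpdatedBefore": last_updated_before,  # mandatory
        }
        if token:
            body["continuationToken"] = token          # token goes in the body
        response = post(body)                          # Steps 2-3: fetch a page
        runs.extend(response.get("value", []))
        token = response.get("continuationToken", "")  # Step 4: update token
        if not token:                                  # Step 5: stop when blank
            return runs


# Demo with a fake `post` standing in for the real HTTP call:
_pages = [
    {"value": [{"runId": "a"}, {"runId": "b"}], "continuationToken": "t1"},
    {"value": [{"runId": "c"}]},  # last page: no continuationToken
]

def _fake_post(body):
    return _pages[1] if body.get("continuationToken") == "t1" else _pages[0]

runs = query_all_pipeline_runs(
    _fake_post, "2021-06-01T00:00:00Z", "2021-06-02T00:00:00Z"
)
print([r["runId"] for r in runs])  # → ['a', 'b', 'c']
```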

    Note:
    When you pass the ContinuationToken in the request body, ensure you also pass LastUpdatedAfter and LastUpdatedBefore. These are mandatory parameters of the request body, without which the data returned will be blank.

    Hope this helps. Please let us know if you have any further queries.


1 additional answer

  1. Ruben Dario Reyes Monsalve

    Thank you very much for the answer and explanation; the next steps I need to follow are now very clear to me.

