Pagination Issue in the Power BI executeQueries API

Gandluri Deekshith Reddy 20 Reputation points
2024-03-22T05:46:19.7966667+00:00

Hi Team,

I am attempting to retrieve values from a table in Power BI via an API in Azure Data Factory. However, the table contains over 100,000 rows, and I've encountered a limitation with the API: it does not allow transfers exceeding 15 MB in size. Could you please assist me with this issue?
POST https://api.powerbi.com/v1.0/myorg/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229/executeQueries
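For reference, the executeQueries endpoint expects a JSON body containing one or more DAX queries; a minimal example body might look like the following (the table name 'MyTable' is a placeholder, not from the original request):

```json
{
  "queries": [
    { "query": "EVALUATE 'MyTable'" }
  ],
  "serializerSettings": { "includeNulls": true }
}
```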


Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

Accepted answer
  1. Nandan Hegde 36,146 Reputation points MVP Volunteer Moderator
    2024-03-22T07:17:17.37+00:00

    Please follow the blog below:

    https://datasharkx.wordpress.com/2023/03/02/overcoming-data-size-row-limitations-of-power-bi-rest-api-for-automated-data-extraction-from-dataset/

    It provides step-by-step details on overcoming the limitation by splitting the data into chunks.
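    As a rough illustration of the chunking idea (not the blog's exact pipeline), the table can be pulled in index ranges through executeQueries until an empty chunk comes back. This is a minimal sketch: the 'MyTable'[RowIndex] index column and the chunk size are assumptions you would adapt to your model.

```python
import json
import urllib.request

DATASET_ID = "cfafbeb1-8037-4d0c-896e-a46fb27ff229"  # dataset from the question
URL = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
CHUNK_SIZE = 10_000  # tune so each response stays under the 15 MB limit


def build_dax(start: int, size: int) -> str:
    # 'MyTable'[RowIndex] is an assumed numeric index column added for chunking.
    return (
        "EVALUATE FILTER(ALL('MyTable'), "
        f"'MyTable'[RowIndex] > {start} && 'MyTable'[RowIndex] <= {start + size})"
    )


def fetch_all(token: str) -> list:
    """Pull the table chunk by chunk until an empty result signals the end."""
    rows, start = [], 0
    while True:
        body = json.dumps(
            {"queries": [{"query": build_dax(start, CHUNK_SIZE)}]}
        ).encode("utf-8")
        req = urllib.request.Request(
            URL,
            data=body,
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
        chunk = result["results"][0]["tables"][0].get("rows", [])
        if not chunk:
            break  # past the last row
        rows.extend(chunk)
        start += CHUNK_SIZE
    return rows
```

    Each individual response then stays small, at the cost of one API call per chunk.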

    1 person found this answer helpful.

2 additional answers

Sort by: Most helpful
  1. phemanth 15,755 Reputation points Microsoft External Staff Moderator
    2024-03-26T05:24:04.69+00:00

    @Deekshith Reddy

    The lack of a primary key in your dataset can potentially complicate working with large datasets in Azure Data Factory (ADF). Here's how to approach this issue in conjunction with the data size limitation of the Power BI API:

    1. Handling Large Data without a Primary Key:

    There are two main approaches to process large datasets in ADF without a primary key:

    • Chunking the Data: You can break down the data retrieval into smaller chunks that fit within the 15 MB limit. ADF allows you to define a chunking mechanism during data transfer. This way, you can process the data in smaller batches and avoid exceeding the size limit.

    • Leveraging a Surrogate Key: If your scenario allows for modification of the data model, consider adding a surrogate key to your dataset. This could be an auto-incrementing integer column that uniquely identifies each row. Having a surrogate key simplifies data retrieval and manipulation, especially when dealing with large datasets.

    2. Choosing the Right Approach:

    • Chunking is ideal for: One-time data transfers or scenarios where modifying the data model is not feasible.
    • Surrogate key is ideal for: Repeated data transfers, improved query performance, and easier data manipulation within ADF.
    • Pagination with Chunking: If chunking alone isn't enough, explore using ADF's pagination feature along with chunking. This allows you to retrieve data in smaller chunks based on a specific date range or other criteria.
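    To make the chunk arithmetic concrete, here is a small sketch (the row count and chunk size are illustrative) that computes the surrogate-key ranges a chunked copy would iterate over:

```python
def chunk_ranges(total_rows: int, chunk_size: int):
    """Yield (start, end) half-open row-index ranges covering total_rows rows."""
    for start in range(0, total_rows, chunk_size):
        yield (start, min(start + chunk_size, total_rows))


# e.g. a 100,000-row table in 30,000-row chunks:
ranges = list(chunk_ranges(100_000, 30_000))
# → [(0, 30000), (30000, 60000), (60000, 90000), (90000, 100000)]
```

    Each range maps to one filtered API call (or one ADF copy activity iteration), keeping every transfer under the size limit.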

    3. Logic Apps and Data Factory:

    Logic Apps can be used to orchestrate workflows in Azure, but for data movement and manipulation, ADF is generally the preferred tool. You can potentially call ADF pipelines from within a Logic App to handle the data retrieval and chunking logic.
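    As a sketch of that handoff, a Logic App (or any HTTP client) can trigger a pipeline through the Data Factory REST API's createRun operation; the subscription, resource group, factory, and pipeline names below are placeholders:

```python
def create_run_url(subscription: str, resource_group: str,
                   factory: str, pipeline: str) -> str:
    # ARM endpoint for Data Factory's "Pipelines - Create Run" operation.
    return (
        f"https://management.azure.com/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}/pipelines/{pipeline}/createRun"
        "?api-version=2018-06-01"
    )
```

    The caller POSTs to this URL with an Azure AD bearer token, passing any pipeline parameters in the JSON body.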

    By combining these techniques, you can effectively retrieve data from your Power BI table in Azure Data Factory even without a primary key, while staying within the API's size limitations.

    0 comments No comments

  2. phemanth 15,755 Reputation points Microsoft External Staff Moderator
    2024-04-10T04:35:47.9166667+00:00

    @Deekshith Reddy G Welcome to Microsoft Q&A platform and thanks for posting your question.

    I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference it! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.

    Ask:

    I am attempting to retrieve values from a table in Power BI via an API in Azure Data Factory. However, the table contains over 100,000 rows, and I've encountered a limitation with the API: it does not allow transfers exceeding 15 MB in size. Could you please assist me with this issue? POST https://api.powerbi.com/v1.0/myorg/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229/executeQueries


    Solution: I resolved the issue using a Logic App.

    If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.

    If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.


    Please don’t forget to Accept Answer and Yes for "was this answer helpful" wherever the information provided helps you, this can be beneficial to other community members.

    0 comments No comments

Your answer

Answers can be marked as Accepted Answers by the question author, which helps users to know the answer solved the author's problem.