Hi Team,
I am attempting to retrieve values from a table in Power BI via an API in Azure Data Factory. However, the table contains over 100,000 rows, and I've encountered a limitation with the API: it does not allow transfers exceeding 15 MB in size. Could you please assist me with this issue?
POST https://api.powerbi.com/v1.0/myorg/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229/executeQueries
Please follow the blog below, which provides step-by-step details on how to overcome this limitation by splitting the data into chunks.
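For reference, here is a minimal Python sketch of a single executeQueries call. The dataset ID is the one from your question; the access-token acquisition and the table name 'Sales' are assumptions for illustration.

```python
import requests

# Dataset ID from the question; the token is assumed to be a valid Azure AD
# access token for the Power BI REST API (how you obtain it is out of scope).
DATASET_ID = "cfafbeb1-8037-4d0c-896e-a46fb27ff229"
ACCESS_TOKEN = "<azure-ad-access-token>"

URL = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"

def execute_dax(dax_query: str) -> list[dict]:
    """POST one DAX query to executeQueries and return the result rows."""
    response = requests.post(
        URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "queries": [{"query": dax_query}],
            "serializerSettings": {"includeNulls": True},
        },
        timeout=120,
    )
    response.raise_for_status()
    # Rows are nested under results -> tables -> rows in the response JSON.
    return response.json()["results"][0]["tables"][0]["rows"]

# A full-table query like the following is what runs into the 15 MB cap:
# rows = execute_dax("EVALUATE 'Sales'")
```

The chunking approach described in section 1 below builds on this helper.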
The lack of a primary key in your dataset can complicate working with large datasets in Azure Data Factory (ADF). Here's how to approach this issue alongside the data size limitation of the Power BI API:
1. Handling Large Data without a Primary Key:
There are two main approaches to processing large datasets in ADF without a primary key:
Leveraging a surrogate key: If your scenario allows for modification of the data model, consider adding a surrogate key to your dataset, such as an auto-incrementing integer column that uniquely identifies each row. A surrogate key simplifies data retrieval and manipulation, especially when dealing with large datasets.
Chunked retrieval: Split the single executeQueries request into several smaller ones so that each response stays under the 15 MB cap; with a surrogate key in place, each request can fetch the next key range (see the sketch after this list).
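To make the chunked approach concrete, here is a minimal keyset-pagination sketch that reuses the execute_dax helper above. It assumes a surrogate key column 'Sales'[RowId] (hypothetical names) holding positive, unique, increasing integers.

```python
CHUNK_SIZE = 10_000  # rows per request; shrink if a chunk still exceeds 15 MB

def fetch_all_rows() -> list[dict]:
    """Pull the whole table in key-ordered chunks via execute_dax (above)."""
    all_rows: list[dict] = []
    last_id = 0  # assumes 'Sales'[RowId] starts above 0
    while True:
        dax = f"""
        EVALUATE
        TOPN(
            {CHUNK_SIZE},
            FILTER('Sales', 'Sales'[RowId] > {last_id}),
            'Sales'[RowId], ASC
        )
        """
        rows = execute_dax(dax)
        if not rows:
            break  # nothing left past last_id
        all_rows.extend(rows)
        # executeQueries keys each row as 'Table[Column]' in the JSON.
        last_id = max(row["Sales[RowId]"] for row in rows)
        if len(rows) < CHUNK_SIZE:
            break  # short chunk: we just read the tail of the table
    return all_rows
```

Because each request filters on the key of the last row already fetched, no chunk overlaps the previous one and a failed chunk can simply be retried.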
2. Choosing the Right Approach:
If modifying the data model is an option, the surrogate key gives you deterministic chunk boundaries; if it is not, you will need to chunk on an existing unique column or a stable ordering of the table.
3. Logic Apps and Data Factory:
Logic Apps can be used to orchestrate workflows in Azure, but for data movement and manipulation, ADF is generally the preferred tool. You can call ADF pipelines from within a Logic App to handle the data retrieval and chunking logic, as sketched below.
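For illustration, this is a sketch of the underlying ARM REST call (pipelines/createRun) that a Logic App's HTTP action or the built-in Data Factory connector issues to start a pipeline run. Every name here is a placeholder, and the management-plane token is assumed to come from a managed identity or service principal.

```python
import requests

# All placeholders: substitute your own subscription, resource group,
# factory, and pipeline names. MGMT_TOKEN must be an Azure AD token for
# https://management.azure.com/ (e.g. from the Logic App's managed identity).
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY = "<data-factory-name>"
PIPELINE = "<pipeline-name>"
MGMT_TOKEN = "<management-plane-token>"

run_url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.DataFactory"
    f"/factories/{FACTORY}/pipelines/{PIPELINE}/createRun"
    "?api-version=2018-06-01"
)

response = requests.post(
    run_url,
    headers={"Authorization": f"Bearer {MGMT_TOKEN}"},
    json={},  # pipeline parameters (e.g. chunk size or key ranges) go here
    timeout=60,
)
response.raise_for_status()
print("Started pipeline run:", response.json()["runId"])
```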
By combining these techniques, you can effectively retrieve data from your Power BI table in Azure Data Factory even without a primary key, while staying within the API's size limitations.
@Deekshith Reddy G Welcome to the Microsoft Q&A platform, and thanks for posting your question.
I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference it! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.
Ask:
I am attempting to retrieve values from a table in Power BI via an API in Azure Data Factory. However, the table contains over 100,000 rows, and I've encountered a limitation with the API: it does not allow transfers exceeding 15 MB in size. Could you please assist me with this issue?
POST https://api.powerbi.com/v1.0/myorg/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229/executeQueries
Solution: I was able to resolve the issue using a Logic App.
If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.
Please don't forget to "Accept Answer" and mark "Yes" for "Was this answer helpful?" wherever the information provided helps you, as this can be beneficial to other community members.