How to dynamically bring data from an API into an Azure Function (and later Azure Data Factory)
I am trying to find a way to dynamically bring in all of the data from an API into an Azure Function and then into Azure Data Factory, without manually creating a separate Azure Function for each chunk.
The problem is the maximum size limits in Azure Functions and Azure Data Factory.
This is how data flows:
- Visual Studio (C#) - Azure Function solution (specifies the paging) --> published to Azure Functions
- Azure Function
- Azure Data Factory
- Azure SQL Server
With the API's maximum number of rows per page (in my case, 2,000 rows) and the maximum number of rows each Azure Function run (on the Consumption plan) can handle (8,000 rows), I had to publish several Azure Functions (currently 4 separate C# files with 4 separate Azure Functions).
For example, for a total of 25,623 rows, I can only process 8,000 rows at a time; otherwise, an error occurs due to the "maximum size" limits in Azure Functions and Azure Data Factory. So I cannot bring in too much data at once.
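To illustrate the arithmetic I am dealing with (the numbers are the limits described above):

```csharp
using System;

class PagingMath
{
    static void Main()
    {
        int totalRows = 25623;       // current total; grows over time
        int rowsPerPage = 2000;      // API page-size limit
        int rowsPerRun = 8000;       // what one Function run can handle on the Consumption plan

        // Ceiling division: how many API pages and how many Function runs are needed.
        int pages = (totalRows + rowsPerPage - 1) / rowsPerPage;
        int runs = (totalRows + rowsPerRun - 1) / rowsPerRun;

        Console.WriteLine($"{pages} pages, {runs} function runs");
        // prints: 13 pages, 4 function runs
    }
}
```

Those 4 runs are exactly the 4 separate Azure Functions I have published so far, and both counts change whenever the total grows.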
Below is an image of the C# file where I bring in rows 8,000 through 16,000.
The issue is that the total row count (currently 25,623) will grow over time, and I do not want to monitor it and manually create a new Azure Function each time it does.
What is the best approach to tackle this issue (staying within the maximum rows per Function) and to dynamically publish from the Visual Studio solution to Azure Functions and Azure Data Factory so that all rows are brought in?
I guess the challenge is that I would still have to dynamically indicate where each page starts (because of the maximum size of the whole dataset) and publish dynamically. How can this be done?
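What I am imagining is a single parameterized function that computes the page offsets itself instead of hard-coding them, roughly like this (a sketch only: `FetchPageAsync`, the `Row` type, and the `start` parameter are placeholders for my real API call; `start` would come from the HTTP trigger's query string so Azure Data Factory could call the same endpoint with different offsets):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Sketch: one parameterized chunk-fetcher instead of four hard-coded functions.
static class ChunkFetcher
{
    const int RowsPerPage = 2000;   // API page-size limit
    const int RowsPerRun  = 8000;   // max rows one Function run should return

    // Fetches up to RowsPerRun rows starting at "start", one API page at a time.
    public static async Task<List<Row>> GetChunkAsync(int start)
    {
        var rows = new List<Row>();
        for (int offset = start; offset < start + RowsPerRun; offset += RowsPerPage)
        {
            // FetchPageAsync is a placeholder for my actual API client call.
            var page = await Api.FetchPageAsync(offset, RowsPerPage);
            rows.AddRange(page);
            if (page.Count < RowsPerPage)   // short page means we hit the end
                break;
        }
        return rows;
    }
}
```

Is something along these lines the right direction, with Data Factory driving the `start` offsets (e.g. in a loop) rather than me publishing a new Function per range?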
Also, if I moved to a more expensive plan (not the Consumption plan), so that each run could handle more rows, is there any limit on the number of rows I can transfer into Azure Data Factory?
Thanks.