Large data processing

Jeff green 21 Reputation points
2021-10-12T23:38:20.937+00:00

Hi,

I want to know: if I make an HTTP GET call that returns 30k records, can I traverse the records one by one? If yes, how?
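As a minimal sketch (in Python, though the question does not name a language), a JSON array response can be parsed once and then traversed record by record. The payload below is a stand-in for the body of the HTTP GET response; the field names are illustrative assumptions.

```python
import json

# Stand-in for the body of the HTTP GET response: a JSON array of 30k records.
payload = json.dumps([{"id": i, "value": i * 10} for i in range(30000)])

records = json.loads(payload)  # parse once into a Python list
for record in records:         # traverse the records one by one
    pass                       # process each record here

print(len(records))  # → 30000
```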

Also, is there an in-memory data table in Azure Functions that I can fill with one HTTP GET call, and then filter repeatedly to get specific records during processing?
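One way to sketch an in-memory "data table" (again assuming Python) is a plain list of dicts fetched once, plus a dict index so each repeated filter becomes an O(1) lookup instead of a full scan per loop iteration. The field names (`sku`, `price`) are illustrative assumptions, not from the question.

```python
# Static records, as they might arrive from the single HTTP GET call.
static_records = [
    {"sku": "A1", "price": 10.0},
    {"sku": "B2", "price": 25.0},
    {"sku": "C3", "price": 7.5},
]

# Build the index once, right after the GET call.
by_sku = {row["sku"]: row for row in static_records}

# Inside the 30k-record loop, each "filter" is a dictionary lookup.
match = by_sku.get("B2")
print(match["price"])  # → 25.0
```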

My requirements are:

  • Get static records into a data table through an HTTP GET call to a web service.
  • Loop through around 30k records, which come from another HTTP GET call to a web service.
  • Within the loop, filter the data table from the first step to get the specific record and perform a calculation. Save the result in another result data table.
  • Finish the loop.
  • Save the records found in the result data table to the database.
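The steps above can be sketched end to end as follows, with in-memory stand-ins for the two HTTP GET calls and the database write. The field names, the calculation, and the `save_to_database` stub are illustrative assumptions, since the question does not specify them.

```python
# Step 1: static records (first HTTP GET call), indexed by key for fast filtering.
static_table = {"A1": {"sku": "A1", "rate": 2.0}, "B2": {"sku": "B2", "rate": 3.0}}

# Step 2: the ~30k records from the second HTTP GET call (three shown here).
incoming = [{"sku": "A1", "qty": 5}, {"sku": "B2", "qty": 2}, {"sku": "A1", "qty": 1}]

# Steps 3-4: loop, filter the static table, calculate, collect results.
results = []
for rec in incoming:
    match = static_table.get(rec["sku"])
    if match is None:
        continue  # no static record for this key; skip it
    results.append({"sku": rec["sku"], "total": rec["qty"] * match["rate"]})

# Step 5: save the result table to the database (stubbed out here).
def save_to_database(rows):
    print(f"saving {len(rows)} rows")

save_to_database(results)  # → saving 3 rows
```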

Please guide me on achieving the above requirements.

Thank you !


Azure Functions
An Azure service that provides an event-driven serverless compute platform.

1 answer

  1. Samara Soucy - MSFT 5,141 Reputation points
    2021-10-13T23:28:34.72+00:00

    Just so I'm clear on what you are asking, is this question about whether or not Functions can handle holding 30k records in memory or are you looking for implementation pointers for a specific language? Each Functions instance on the Consumption plan has 1.5GB of memory available, so the size of these records will probably be the determining factor. Using a Premium or Dedicated SKU can give your function access to more memory depending on the size you pick. If you would like help with implementation, is there a specific language you are looking to work with?
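Whether 30k records fit within the 1.5 GB Consumption-plan limit can be roughly estimated from the per-record size. A sketch (the record shape is an assumption, and `sys.getsizeof` is shallow, so this understates the true footprint; it only shows that modest records are far below the limit):

```python
import sys

# Rough memory estimate for 30,000 records of an assumed shape.
record = {"id": 12345, "name": "example", "value": 3.14}
approx_bytes = sys.getsizeof(record) * 30_000  # shallow size only

print(approx_bytes < 1.5 * 1024**3)  # → True
```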

