Design options for storing 4 MB in session state in ASP.NET Web App

Siegfried Heintze 1,861 Reputation points
2022-08-27T19:44:37.85+00:00

I'd like to prototype a web site where my prospective customers could upload 30-40 CSV files (about 2 MB in total), and my VB.NET code would ingest them and compose an XML file for download that could be viewed in Microsoft Excel...
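The CSV-to-XML composition itself is independent of where the files are stored. A minimal sketch of that transformation, in Python rather than the VB.NET mentioned above, just to illustrate the shape of the logic (the element names `Workbook`/`Table`/`Row`/`Cell` are illustrative, not a required Excel schema):

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_files_to_xml(csv_files):
    """Combine several uploaded CSV files into one XML document.

    csv_files: dict mapping an upload name to that file's text content.
    Returns the composed XML as a string.
    """
    root = ET.Element("Workbook")
    for name, text in csv_files.items():
        # One <Table> per uploaded CSV, tagged with its source file name.
        table = ET.SubElement(root, "Table", attrib={"source": name})
        reader = csv.DictReader(io.StringIO(text))
        for row in reader:
            row_el = ET.SubElement(table, "Row")
            for col, value in row.items():
                ET.SubElement(row_el, "Cell", attrib={"column": col}).text = value
    return ET.tostring(root, encoding="unicode")
```

Because this is pure transformation code, it can run the same way whether the inputs come from session state, local disk, or blobs.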

Since I don't have any funding yet, keeping the costs down is important...

This is easy to do with an Azure VM that has some built-in NTFS file system space, but I suspect using an Azure App Service web app would be much cheaper...

So how do I accommodate the possibility of multiple users simultaneously uploading 30-40 CSV files to their respective session state variables, in a way that does not preclude (should I get some funding) using the auto-scaling features of App Service or Kubernetes?

Where are the limits on session state variables defined? Is this an ASP.NET issue or an Azure Web App setting?

Hmm... blob storage is cheap, but then I have to write code to delete the blobs should the user get halfway through uploading their data and abandon the session... Using NTFS or Linux filesystem storage has the same extra complexity...
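For what it's worth, the abandoned-upload cleanup doesn't necessarily require custom code: Azure Blob Storage has a built-in lifecycle management feature that can delete blobs automatically after a period of inactivity. A sketch of such a policy, deleting staged uploads one day after their last modification (the container name `uploads-staging` is an assumption):

```json
{
  "rules": [
    {
      "name": "delete-abandoned-uploads",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "uploads-staging/" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 1 }
          }
        }
      }
    }
  ]
}
```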

What is your recommendation?

Hmmm... Maybe Azure Service Bus or Azure Queue Storage, where each customer gets their own temporary queue, and then I have an Azure Function (or WebJob?) delete the stale queues? I'm not sure this is simpler...

Thanks

Siegfried

If I could just store everything in session state variables, I think the logic would be much simpler...

Tags: ASP.NET Core, Azure App Service

1 answer
  1. Ryan Hill 25,666 Reputation points Microsoft Employee
    2022-09-02T02:51:39.16+00:00

    Hi @Siegfried Heintze ,

I realize that since you asked this question a while ago, you've probably explored various methods of doing this file processing. In case you're still looking for a path forward: instead of maintaining session state with user file uploads, storing the files on NTFS drives for processing, and keeping track of it all, I suggest using an App Service web front end and an Azure Functions API back end.

Use the tutorial Upload and analyze a file with Azure Functions and Blob Storage as a guide. The app service just hands off the upload request, so you should be able to get away with a low-resource S1 tier and a function on a consumption plan, where you only pay for the compute you use.

    Again, using the above tutorial as a guide, I would do the following:

    Web App -> upload file to blob storage -> Azure function triggers off blob upload -> process the file for XML composition -> store the file in a separate blob container.

The beauty of the consumption plan is that it will fan out and back in based on demand. To keep things organized and tidy, the web front end would set the container name for each client request, which the function can then pull from the trigger request.
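Setting a per-client container name mostly comes down to deriving a valid name from a client identifier: Azure container names must be 3-63 characters of lowercase letters, digits, and hyphens. A small sketch of such a helper (the `uploads-` prefix and the function name are assumptions for illustration):

```python
import re

def container_name_for_client(client_id: str) -> str:
    """Derive a valid Azure blob container name from a client identifier.

    Azure container names must be 3-63 characters long and may contain
    only lowercase letters, digits, and single hyphens.
    """
    # Replace anything that isn't a lowercase letter, digit, or hyphen.
    name = re.sub(r"[^a-z0-9-]", "-", client_id.lower())
    # Collapse runs of hyphens and trim them from the ends.
    name = re.sub(r"-+", "-", name).strip("-")
    # Prefix with a fixed namespace and enforce the 63-character cap.
    return f"uploads-{name}"[:63]
```

The web front end would call this when staging an upload, and the blob-triggered function can recover the client from the container name on its trigger path.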