How to stream the contents of a large zip file without hitting the 2 GB memory limit

Marc Hedgley 185 Reputation points
2023-12-11T17:07:30.99+00:00

We are currently facing a challenge with our Azure Functions code and require some guidance to overcome a specific limitation.

Our scenario involves allowing clients to upload sizable zip files (ranging from 10 GB to 50 GB) to Azure Blob Storage over SFTP. Once a file lands in storage, a function streams its contents and runs a format scan against a predefined whitelist. However, we've encountered an issue with the following code snippet:

using (MemoryStream blobMemStream = new MemoryStream())
{
    // Downloads the entire blob into memory before opening the archive.
    await blob.DownloadToStreamAsync(blobMemStream);

    using (ZipArchive archive = new ZipArchive(blobMemStream))
    {
        // ... scan entries against the whitelist ...
    }
}

The function fails with the error "Stream was too long," which appears to come from the ~2 GB ceiling on a MemoryStream (it is backed by a single byte array, capped at Int32.MaxValue bytes). We are reaching out for guidance on alternative methods to stream the contents of a zip file to a temporary storage container without encountering memory limitations.
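For context, one commonly suggested alternative is to avoid buffering the blob at all and instead open it as a lazily downloading, seekable stream, which ZipArchive can read directly. A minimal sketch, assuming the Azure.Storage.Blobs v12 SDK and placeholder values for the connection string, container, and blob name:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class ZipScanSketch
{
    static async Task ScanAsync()
    {
        // Placeholder values; substitute your own connection details.
        var blob = new BlobClient("<connection-string>", "<container>", "uploads/big.zip");

        // OpenReadAsync returns a seekable stream that fetches byte ranges
        // on demand, so memory use stays bounded regardless of blob size.
        using Stream blobStream = await blob.OpenReadAsync();
        using var archive = new ZipArchive(blobStream, ZipArchiveMode.Read);

        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            // entry.Open() yields the decompressed bytes of each entry as a
            // stream, which can be checked against the whitelist or copied
            // to a temporary container without materializing the whole file.
            Console.WriteLine($"{entry.FullName} ({entry.Length} bytes)");
        }
    }
}
```

Note that ZipArchive in Read mode needs a seekable stream (the central directory sits at the end of the archive), which is why the range-based stream from OpenReadAsync works where a forward-only download stream would not.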

Your assistance in this matter would be highly appreciated. If there are any best practices or alternative approaches we should consider, please advise.

Thank you for your time and support.


Accepted answer

Vinodh247 34,741 Reputation points MVP Volunteer Moderator
2023-12-12T07:07:34.4533333+00:00

