How to stream the contents of a large zip file without hitting the 2 GB memory limit in Azure Functions
We are currently facing a challenge with our Azure Functions code and require some guidance to overcome a specific limitation.
Our scenario involves allowing clients to upload sizable zip files (ranging from 10 GB to 50 GB) to Azure Blob Storage via SFTP. Once a file lands in storage, a function streams its contents and runs a format scan against a predefined whitelist. However, we've encountered an issue with the following code snippet:
using (MemoryStream blobMemStream = new MemoryStream())
{
    // Downloads the entire blob into memory before opening the archive
    await blob.DownloadToStreamAsync(blobMemStream);
    using (ZipArchive archive = new ZipArchive(blobMemStream))
    {
        // ... enumerate and scan entries ...
    }
}
The function fails with the error "Stream was too long." This appears to be because MemoryStream is backed by a byte array, which is capped at 2 GB, so buffering the whole blob in memory cannot work for files of this size. We are reaching out for guidance on alternative methods to stream the contents of a zip file to a temporary storage container without encountering this memory limitation.
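For context, one approach we have been considering (a minimal sketch, not verified at our file sizes) is to avoid MemoryStream entirely and open the archive over a streaming blob read. With the Azure.Storage.Blobs v12 SDK, BlobClient.OpenReadAsync returns a seekable stream that fetches ranges on demand, so the whole blob is never buffered in memory. The connection string, container, and blob names below are placeholders:

using System;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class ZipScanSketch
{
    public static async Task ScanAsync(string connectionString)
    {
        // Placeholder container/blob names for illustration.
        var blobClient = new BlobClient(connectionString, "uploads", "archive.zip");

        // OpenReadAsync streams ranges from the service instead of
        // downloading the full blob, so no 2 GB MemoryStream is needed.
        await using Stream blobStream = await blobClient.OpenReadAsync();
        using var archive = new ZipArchive(blobStream, ZipArchiveMode.Read);

        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            // The whitelist format scan would run per entry here.
            Console.WriteLine($"{entry.FullName} ({entry.Length} bytes)");
        }
    }
}

We are unsure whether ZipArchive's seeking pattern over a remote range-read stream performs acceptably at 50 GB, so advice on that trade-off would also be welcome.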
Your assistance in this matter would be highly appreciated. If there are any best practices or alternative approaches we should consider, please advise.
Thank you for your time and support.
Azure Functions
Azure Blob Storage
Vinodh247 34,741 Reputation points MVP Volunteer Moderator
2023-12-12T07:07:34.4533333+00:00