Max size of System.IO.MemoryStream

August Asheim Birkeland 20 Reputation points
2023-02-02T10:46:56.6733333+00:00

I'm trying to fetch a 3 GB .bak file over SFTP and upload it to my blob storage using the SSH.NET package.

I get a System.IO.IOException: "Stream was too long." error!

I can't find the maximum stream size that System.IO.MemoryStream can handle documented anywhere. Does anyone know?

This is my script:

using System;
using System.IO;
using System.Threading.Tasks;
using Renci.SshNet;
//using Microsoft.WindowsAzure.Storage;
using Azure.Storage.Blobs.Specialized;
using Azure.Storage.Blobs;
//using Microsoft.WindowsAzure.Storage.Auth;
//using Microsoft.WindowsAzure.Storage.Blob;

namespace SFTPtoBlob
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Fetching backup file from SFTP...");
            Task.Run(async () => await FetchFromSFTP()).Wait();
            Console.WriteLine("Done.");
        }

        private static async Task FetchFromSFTP()
        {
            var sftpClient = new SftpClient("", 22, "", "");
            sftpClient.Connect();

            var fileStream = new MemoryStream();
            sftpClient.DownloadFile(".bak", fileStream);
            fileStream.Seek(0, SeekOrigin.Begin);

            Console.WriteLine("Uploading backup file to Azure Blob Storage...");
            await UploadToBlobStorage(fileStream);
            Console.WriteLine("Done.");

            sftpClient.Disconnect();
        }

        private static async Task UploadToBlobStorage(Stream fileStream)
        {
            // var blob = container.GetPageBlobReference("<blob_name>");
            // await blob.UploadFromStreamAsync(fileStream);

            string connectionString = "";
            string containerName = "";
            string blobName = ".bak";

            // Get a reference to a container named "sample-container" and then create it
            BlobContainerClient container = new BlobContainerClient(connectionString, containerName);
            container.Create();

            // Get a reference to a blob named "sample-file" in a container named "sample-container"
            PageBlobClient blob = container.GetPageBlobClient(blobName);
            Console.WriteLine($"MAX {blob.PageBlobMaxUploadPagesBytes}");
            // Upload local file
            await blob.UploadPagesAsync(fileStream, 0, null);
        }
    }
}

Thanks in advance!

1 answer

  1. SaiKishor-MSFT 17,176 Reputation points
    2023-02-02T21:01:36.23+00:00

    @August Asheim Birkeland Thanks for reaching out to Microsoft Q&A.

    A MemoryStream is backed by a single byte array, and its Capacity property is an Int32, so the hard upper limit is Int32.MaxValue bytes (2,147,483,647 bytes, roughly 2 GB) no matter how much memory the system has. A 3 GB file therefore cannot fit in a MemoryStream, which is why you get the "Stream was too long." IOException.
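    The arithmetic behind the error can be checked directly (a minimal standalone snippet, not taken from your code):

    ```csharp
    using System;

    class MemoryStreamLimitDemo
    {
        static void Main()
        {
            // MemoryStream.Capacity is declared as int, so the ceiling is Int32.MaxValue bytes.
            long maxMemoryStreamBytes = int.MaxValue;        // 2,147,483,647 (~2 GiB)
            long bakFileBytes = 3L * 1024 * 1024 * 1024;     // a 3 GiB backup file

            Console.WriteLine(maxMemoryStreamBytes);
            Console.WriteLine(bakFileBytes > maxMemoryStreamBytes); // True: the file cannot fit
        }
    }
    ```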

    Reference docs- https://learn.microsoft.com/en-us/dotnet/api/system.io.memorystream.capacity?view=net-7.0

    https://learn.microsoft.com/en-us/dotnet/api/system.int32.maxvalue?view=net-7.0

    Please also refer to this thread that discusses a similar issue and a workaround- https://learn.microsoft.com/en-us/answers/questions/630079/how-to-store-2gb-above-size-file-in-file-stream-ta

    P.S: You can also try using a FileStream instead of a MemoryStream and see if that helps you here.
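    As a rough sketch of that idea (assumptions: the host, credentials, paths and connection string below are placeholders, and I've swapped the page blob for a block blob, since block blobs accept arbitrary sizes while page blobs must be 512-byte aligned and are intended for VHDs):

    ```csharp
    using System.IO;
    using System.Threading.Tasks;
    using Renci.SshNet;
    using Azure.Storage.Blobs;

    class SftpToBlobSketch
    {
        static async Task Main()
        {
            using var sftp = new SftpClient("<host>", 22, "<user>", "<password>");
            sftp.Connect();

            // Stream the download to a temp file on disk instead of into RAM;
            // a FileStream has no 2 GB cap.
            string tempPath = Path.GetTempFileName();
            using (var fileStream = File.Create(tempPath))
            {
                sftp.DownloadFile("/backups/db.bak", fileStream);
            }
            sftp.Disconnect();

            var container = new BlobContainerClient("<connection_string>", "<container>");
            await container.CreateIfNotExistsAsync();

            // BlobClient.UploadAsync chunks large files into blocks automatically.
            BlobClient blob = container.GetBlobClient("db.bak");
            using (var uploadStream = File.OpenRead(tempPath))
            {
                await blob.UploadAsync(uploadStream, overwrite: true);
            }

            File.Delete(tempPath);
        }
    }
    ```

    This never holds more than a buffer's worth of the file in memory, at the cost of temporary disk space.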

    Hope this helps. Please let us know if you have any more questions and we will be glad to assist you further. Thank you!
