We have a line-of-business requirement to let clients download files from Azure Storage through a web application built with ASP.NET Core MVC and hosted in Azure.
Downloading via a SAS URL works fine, but only while the storage account is public. Due to security compliance we had to make the storage account private, and after that downloading via the SAS URL no longer works.
So I tried another approach: download the content from the container using chunked, parallel transfer options, convert it to a byte array, and return it to the browser as file output.
This works fine up to about 400 MB; beyond that it throws an OutOfMemoryException.
Find below the code snippet:
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

public async Task<byte[]> DownloadFilefromStoarage(string blobName)
{
    try
    {
        var _connectionString = "XXX";
        var _containerName = "XXX";
        var blobServiceClient = new BlobServiceClient(_connectionString);
        var containerClient = blobServiceClient.GetBlobContainerClient(_containerName);
        var blobClient = containerClient.GetBlobClient(blobName);

        var transferOptions = new StorageTransferOptions
        {
            // Maximum number of parallel transfer workers
            MaximumConcurrency = 2,
            // Size of the first range request: 8 MiB
            InitialTransferSize = 8 * 1024 * 1024,
            // Maximum size of each subsequent range request: 4 MiB
            MaximumTransferSize = 4 * 1024 * 1024
        };

        var downloadOptions = new BlobDownloadToOptions
        {
            TransferOptions = transferOptions
        };

        // Dispose the stream once the buffer has been copied out
        using var memoryStream = new MemoryStream();
        await blobClient.DownloadToAsync(memoryStream, downloadOptions);
        return memoryStream.ToArray();
    }
    catch (Exception ex)
    {
        // rethrow without resetting the stack trace
        throw;
    }
}
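For context, the byte array is handed back to the browser roughly like this (a sketch, not the exact production code; the controller name, action name, and content type are assumptions, and it calls the `DownloadFilefromStoarage` helper shown above):

```csharp
using Microsoft.AspNetCore.Mvc;

public class DownloadController : Controller
{
    [HttpGet]
    public async Task<IActionResult> Download(string blobName)
    {
        // Fully buffers the blob into managed memory
        byte[] data = await DownloadFilefromStoarage(blobName);

        // While this result is being built, the process briefly holds both the
        // MemoryStream's internal buffer and the ToArray() copy -- roughly 2x
        // the blob size -- which is consistent with hitting an
        // OutOfMemoryException somewhere past the 400 MB mark.
        return File(data, "application/octet-stream", blobName);
    }
}
```

Since `MemoryStream.ToArray()` copies the underlying buffer, peak memory is roughly twice the blob size regardless of the chunked transfer options, which only control how the download is fetched over the network, not how it is buffered.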