@sunny I'm glad that you were able to resolve your issue and thank you for posting your solution so that others experiencing the same thing can easily reference this!
Issue:
You had working code that downloads a large file (over 2 GB) from an AWS S3 pre-signed URL and then uploads it to a third-party API in chunks, as Base64-encoded strings, from an Azure environment. The destination API required a
Content-Type: application/x-www-form-urlencoded
header. However, the overall process was slow, and you observed that each read filled the buffer with only about 10 KB, even though the desired chunk size was 5 MB.
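For context, an API that expects Content-Type: application/x-www-form-urlencoded typically receives each Base64 chunk as a form field in the request body. Below is a minimal sketch of what a single chunk request could look like; the endpoint and field names (filename, isNewFile, chunk) are assumptions, since the actual third-party API contract isn't shown here, and the body is URL-encoded manually because FormUrlEncodedContent can hit string-length limits with multi-megabyte values on older runtimes.

using System;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Hypothetical helper: posts one Base64-encoded chunk as a form-urlencoded body.
// The endpoint and field names are assumptions, not the actual API contract.
static async Task<HttpResponseMessage> UploadChunkAsync(
    HttpClient client, string uploadUrl, string fileName, bool isNewFile, byte[] buffer, int count)
{
    string body =
        "filename=" + WebUtility.UrlEncode(fileName) +
        "&isNewFile=" + (isNewFile ? "true" : "false") +
        "&chunk=" + WebUtility.UrlEncode(Convert.ToBase64String(buffer, 0, count));

    // StringContent with this media type produces the required
    // Content-Type: application/x-www-form-urlencoded header.
    var content = new StringContent(body, Encoding.UTF8, "application/x-www-form-urlencoded");
    return await client.PostAsync(uploadUrl, content);
}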
Solution:
You improved performance by first saving the downloaded content to a local staging file and then reading it back in 5 MB chunks when uploading to the third-party API. Below is the key part of that solution:
HttpResponseMessage responsedownload = await client.GetAsync(request.fileUrl, HttpCompletionOption.ResponseHeadersRead);
if (responsedownload.IsSuccessStatusCode)
{
    string uploadresult = "";
    // Stage the download on local storage (D:\home is the app's persistent storage on Azure App Service)
    string filepath = Path.Combine(@"D:\home", request.filename);

    // Save the response to a file stream, then read it back in 5 MB chunks for the upload
    using (Stream contentstream = await responsedownload.Content.ReadAsStreamAsync())
    {
        // FileMode.Create (rather than OpenOrCreate) truncates any leftover staging file
        using (var fs = new FileStream(filepath, FileMode.Create))
        {
            await contentstream.CopyToAsync(fs);
            fs.Position = 0;

            byte[] buffer = new byte[5000000];   // 5 MB chunk size
            int read;
            string uploadfilename = request.filename;
            string isNewfile = "true";
            XmlDocument xmlDoc = new();          // used when processing the upload API's response

            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Base64-encode only the bytes actually read in this iteration
                string chunkstring = Convert.ToBase64String(buffer, 0, read);

                // Call the upload API with chunkstring, uploadfilename and isNewfile
                // (request construction omitted here; the API expects
                // Content-Type: application/x-www-form-urlencoded)
                if (responseupload.IsSuccessStatusCode)
                {
                    // process uploadresult returned by the API
                }
                else
                {
                    // throw / log the chunk upload failure
                }

                // Subsequent chunks reuse the name returned by the API and are no longer "new"
                uploadfilename = uploadresult;
                isNewfile = "false";
            }
        }

        // Delete the staging file once all chunks have been uploaded
        File.Delete(filepath);
    }
    return uploadresult;
}
else
{
    // throw an error if the download itself failed
    throw new HttpRequestException("Download from the pre-signed URL failed: " + responsedownload.StatusCode);
}
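As a side note, the 10 KB reads you originally saw when reading the HTTP response stream directly are expected behaviour: a single Read on a network stream returns whatever bytes happen to be available, not a full buffer. Staging to a file sidesteps that. If you ever want to avoid the staging file, another option is to keep reading from the response stream until the 5 MB buffer is full (or the stream ends) before encoding the chunk. A rough sketch of that read-until-full loop is below; the upload call is elided, as above.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Reads the response stream in 5 MB chunks without a staging file.
// Each outer iteration fills the buffer completely unless the stream ends first.
static async Task StreamInChunksAsync(HttpResponseMessage responsedownload)
{
    using Stream contentstream = await responsedownload.Content.ReadAsStreamAsync();
    byte[] buffer = new byte[5000000];
    int filled;
    do
    {
        filled = 0;
        int read;
        // A single ReadAsync on a network stream often returns only a few KB,
        // so loop until the buffer is full or the stream is exhausted.
        while (filled < buffer.Length &&
               (read = await contentstream.ReadAsync(buffer, filled, buffer.Length - filled)) > 0)
        {
            filled += read;
        }
        if (filled > 0)
        {
            string chunkstring = Convert.ToBase64String(buffer, 0, filled);
            // upload chunkstring here, exactly as in the loop above
        }
    } while (filled == buffer.Length);
}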