Transfer file from S3 URL to Azure blob of size >100MB with Azure Logic Apps

Shelly Goel 36 Reputation points
2022-02-23T11:42:48.06+00:00

Hello,

We have a requirement to copy a file of up to 5 GB from its S3 pre-signed URL to an Azure blob. We do this with an HTTP action that fetches the S3 URL content, then pass that content to a Create blob step. We have enabled chunking in both steps, but they still fail with the error below:

"BadRequest. Http request failed as there is an error: 'Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600"

The steps above work for files smaller than 100 MB.
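(The buffer limit quoted in the error works out to exactly this 100 MB threshold; a quick arithmetic check, not part of the original post:)

```python
# Maximum buffer size from the Logic Apps error message, in bytes
max_buffer_bytes = 104_857_600

# 100 MB expressed in binary megabytes (1 MB = 1024 * 1024 bytes)
print(max_buffer_bytes == 100 * 1024 * 1024)  # True
```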

Please suggest a good way to copy larger files from an S3 URL to an Azure blob within an Azure Logic App.

Thanks
Shelly

Azure Logic Apps

1 answer

  1. AnuragSingh-MSFT 21,546 Reputation points Moderator
    2022-02-25T07:21:21.843+00:00

    Hi @Shelly Goel ,

    Welcome to Microsoft Q&A! Thanks for posting the question.

    I understand that you are trying to move large content (a file) from an S3 pre-signed URL to an Azure blob in Logic Apps. Based on the error received, please note that to download chunked messages from an endpoint over HTTP, the endpoint must support partial content requests (chunked downloads). Please refer to Download content in chunks.
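    One way to confirm that the source endpoint actually supports partial content requests is to send a small Range request and check for a 206 status. A minimal Python sketch (the URL is a placeholder for your pre-signed S3 URL; this is an illustrative check, not part of the Logic App itself):

    ```python
    import urllib.request

    def supports_range_requests(url: str) -> bool:
        """Return True if the server honors a 1-byte Range request with 206 Partial Content."""
        req = urllib.request.Request(url, headers={"Range": "bytes=0-0"})
        with urllib.request.urlopen(req) as resp:
            # 206 means the endpoint supports chunked downloads;
            # 200 means it ignored the Range header and returned the full body.
            return resp.status == 206
    ```

    S3 GET requests generally honor Range headers, so this check is mainly useful when the source is some other endpoint.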

    You may use custom code with Logic Apps through one of the methods below:

    1. Run code snippets by using inline code in Azure Logic Apps

    2. Trigger an Azure Function from Logic Apps

    The sample .NET snippet below, based on this article, can be included in the Function App to perform the required operation (this is one of the ways to do it):

    using System.Net.Http;
    using Azure.Storage.Blobs;

    string largeFileUrl = "<Source URL>";
    string storageConnString = "<Storage connection string for the Azure Storage account>";

    string storageContainerName = "<destination container>";
    string blobName = "<new blob name in Azure storage container>";

    //BlobClient for saving/uploading content to Azure storage container as blob
    BlobClient blobClient = new BlobClient(storageConnString, storageContainerName, blobName);

    //HttpClient for getting content from source;
    //ResponseHeadersRead avoids buffering the whole response body in memory
    var httpClient = new HttpClient();
    HttpResponseMessage response = await httpClient.GetAsync(largeFileUrl, HttpCompletionOption.ResponseHeadersRead);
    response.EnsureSuccessStatusCode();

    using (var stream = await response.Content.ReadAsStreamAsync())
    {
        await blobClient.UploadAsync(stream);
    }
    

    Please let me know if you have any questions.

    ---
    Please 'Accept as answer' and 'Upvote' if it helped, so that it can help others in the community looking for help on similar topics.

