How to upload 25 GB files into Azure Blob Storage from an Azure App Service

Shiva Sadayan 41 Reputation points
2024-04-17T16:19:52.28+00:00

Hi All, my customer wants to upload a 25 GB zip file from a React application exposed through an Application Gateway.

My initial thought was to use the Blob SDK in the React app and upload the file to an Azure storage account using a SAS token. However, this was declined due to security risks.

My project uses APIM, Application Gateway & Azure Data Factory.

Shall I create a public-facing storage account protected with Microsoft Entra ID (formerly Azure AD), chunk the file in the React app, and upload the chunks directly into the storage account without going through the Application Gateway? Once the file is successfully uploaded, I can move it to a private storage account.

Can someone please point me in the right direction?

Thank you


2 answers

  1. KarishmaTiwari-MSFT 18,527 Reputation points Microsoft Employee
    2024-04-18T01:53:40.2633333+00:00

    @Shiva Sadayan Since using a SAS token was declined and security is a top priority for your customer, your approach looks right to me.

    You can use Azure Blob Storage with Microsoft Entra ID (formerly Azure AD) authentication. Configure the React app to chunk the file and upload the chunks to the storage account with an Entra ID token. This way the upload is secure and authenticated without using SAS tokens.
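    As a rough sketch of how that could be wired up in the browser — the client ID, tenant ID, account, and container names below are placeholders, not values from this thread, and only the URL helper is concrete code:

```javascript
// Pure helper: build the blob endpoint URL for an upload target.
function blobUrl(account, container, blobName) {
  return `https://${account}.blob.core.windows.net/${container}/${encodeURIComponent(blobName)}`;
}

// Entra ID-authenticated client (pseudocode against the real packages
// @azure/identity and @azure/storage-blob; all IDs are placeholders):
//   import { InteractiveBrowserCredential } from "@azure/identity";
//   import { BlockBlobClient } from "@azure/storage-blob";
//   const credential = new InteractiveBrowserCredential({
//     clientId: "<app-registration-client-id>",
//     tenantId: "<tenant-id>",
//   });
//   const client = new BlockBlobClient(
//     blobUrl("<storage-account>", "uploads", file.name), credential);
//   await client.uploadData(file, { blockSize: 100 * 1024 * 1024, concurrency: 4 });
```

    Note that the signed-in user (or app) also needs an RBAC data-plane role such as "Storage Blob Data Contributor" on the container or account; Entra ID authentication alone does not grant blob access.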

    If the file size is a concern, consider using Azure Data Factory to upload the file. ADF can handle large file uploads efficiently and can be integrated with your React app and Azure Storage.

    Implement resumable uploads in your React app to handle interruptions or failures during the upload process. This can be done by breaking the file into smaller chunks (blocks), uploading them individually, and then committing them as a single blob on the service side.
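    That chunk-and-commit pattern maps directly onto block blobs. A minimal sketch of the chunk-planning side in plain JavaScript; the SDK calls (`stageBlock`, `commitBlockList`, `getBlockList`) are shown only as comments, and the 100 MiB block size and the `blockBlobClient` name are assumptions for illustration:

```javascript
// Sketch of the chunk-planning logic for a resumable block-blob upload.

const BLOCK_SIZE = 100 * 1024 * 1024; // 100 MiB per staged block (assumed)

// Block IDs must be base64-encoded and the same length within one blob.
// (In a browser, use btoa() instead of Buffer.)
function blockId(index) {
  return Buffer.from(String(index).padStart(6, "0")).toString("base64");
}

// Split a file of `totalSize` bytes into { id, offset, length } chunks.
function planChunks(totalSize, blockSize = BLOCK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < totalSize; offset += blockSize) {
    chunks.push({
      id: blockId(chunks.length),
      offset,
      length: Math.min(blockSize, totalSize - offset),
    });
  }
  return chunks;
}

// Upload loop against @azure/storage-blob (pseudocode):
//   const chunks = planChunks(file.size);
//   for (const c of chunks) {
//     await blockBlobClient.stageBlock(
//       c.id, file.slice(c.offset, c.offset + c.length), c.length);
//   }
//   await blockBlobClient.commitBlockList(chunks.map(c => c.id));
// To resume after a failure, call getBlockList("uncommitted") and skip
// block IDs that were already staged.
```

    Staged blocks that are never committed are discarded by the service after a retention period (currently seven days), so an abandoned upload does not leave a partial blob behind.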

    Ensure that your Azure Blob Storage account is secured with the appropriate access controls. Use Azure Key Vault to manage and retrieve any necessary secrets or keys securely.

    Once the file is successfully uploaded to the public-facing storage account, you can move it to a private storage account using Azure Data Factory or Azure Functions for additional security.
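    If the move runs in ADF, a Copy activity over Binary datasets streams the blob without parsing it. A sketch of what that activity JSON might look like — the activity and dataset names are placeholders pointing at the public and private accounts, not values from this thread:

```json
{
  "name": "MoveUploadToPrivateAccount",
  "type": "Copy",
  "inputs": [ { "referenceName": "PublicUploadsBlobDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "PrivateStorageBlobDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "BinarySource", "storeSettings": { "type": "AzureBlobStorageReadSettings" } },
    "sink": { "type": "BinarySink", "storeSettings": { "type": "AzureBlobStorageWriteSettings" } }
  }
}
```

    A Delete activity after the copy (or a lifecycle-management rule on the public account) can then clean up the source blob once the move succeeds.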


  2. Shiva Sadayan 41 Reputation points
    2024-04-18T16:45:09.6466667+00:00

    @KarishmaTiwari-MSFT thank you for your response.

    My apologies for posting a loaded question.

    In my original post, I mentioned that this project utilizes an Application Gateway and API Management. The React app is a public-facing application that uses the Blob SDK to upload the 25 GB file in chunks to a storage account deployed outside the VNET.

    You might wonder why we took this approach. We decided on this because deploying the storage account within our VNET would mean network traffic to the storage account might need to be routed through the application gateway. I have been advised that this is not a best practice, as it can degrade the performance of the application gateway, especially when uploading large documents in chunks.

    Would you recommend publishing the storage account within the VNET and routing the traffic through the application gateway, or should it be published outside the VNET?

    If the upload is halfway through when the connection fails or times out, and the user attempts to upload again, will it replace the file, or do I need to delete the file before resuming?

    Could you please give some pointers on using ADF from a React application to store blobs in a storage account?

    Thank you.