How to upload large files in chunks using rest API, azure blob list storage from salesforce using LWC.

Sumit Gupta 0 Reputation points
2023-03-31T06:29:08.1233333+00:00

I have tried to upload large files from an LWC component in chunks, but I could not find any REST API call flow for uploading large files in the 4 GB to 10 GB range.

Can you please provide any links or documents for uploading large files in chunks from Rest API in Azure Blobs?


1 answer

  1. Sumarigo-MSFT 44,081 Reputation points Microsoft Employee
    2023-03-31T08:59:06.2866667+00:00

    @Sumit Gupta Welcome to the Microsoft Q&A forum, and thank you for posting your query here!

    Yes, you can upload large files in chunks using the Azure Blob Storage REST API. Here are the high-level steps you can follow:

    1. Divide the large file into smaller chunks (blocks). The maximum block size is 4,000 MiB in recent service versions (it was 100 MiB, and originally 4 MiB, in older versions).
    2. Upload each chunk using the "Put Block" operation, giving each block a unique base64-encoded block ID. This uploads a block of data to the specified block blob, but the block is not yet part of the blob.
    3. Commit the blocks using the "Put Block List" operation. This creates (or overwrites) the blob from the list of block IDs that have been uploaded. Note that the single-call "Put Blob" operation is only suitable for smaller files; for chunked uploads, the blob comes into existence when the block list is committed.
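
    The steps above can be sketched in JavaScript (the language an LWC component would use). This is a minimal, illustrative sketch, not production code: `sasUrl` is assumed to be a pre-generated SAS URL for the target blob, the 4 MiB chunk size is arbitrary, and real code also needs error handling and retries (and, from LWC, a CSP Trusted Site entry for the storage endpoint):

    ```javascript
    // Chunked upload via the Blob Storage REST API (Put Block / Put Block List).
    const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MiB per block (illustrative)

    // Block IDs must be base64-encoded and the same length for every block.
    function makeBlockId(index) {
      return btoa(String(index).padStart(6, '0'));
    }

    // Build the XML body that Put Block List expects.
    function buildBlockListXml(blockIds) {
      const entries = blockIds.map((id) => `<Latest>${id}</Latest>`).join('');
      return `<?xml version="1.0" encoding="utf-8"?><BlockList>${entries}</BlockList>`;
    }

    async function uploadInBlocks(sasUrl, file) {
      const blockIds = [];
      for (let offset = 0, i = 0; offset < file.size; offset += CHUNK_SIZE, i++) {
        const chunk = file.slice(offset, offset + CHUNK_SIZE);
        const blockId = makeBlockId(i);
        blockIds.push(blockId);
        // Put Block: ?comp=block&blockid=<base64 id>
        await fetch(`${sasUrl}&comp=block&blockid=${encodeURIComponent(blockId)}`, {
          method: 'PUT',
          body: chunk,
        });
      }
      // Put Block List: commits the uploaded blocks into the final blob.
      await fetch(`${sasUrl}&comp=blocklist`, {
        method: 'PUT',
        headers: { 'Content-Type': 'application/xml' },
        body: buildBlockListXml(blockIds),
      });
    }
    ```

    Fixed-width, zero-padded block IDs are used because the service requires all block IDs within a blob to have the same encoded length.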

    Here are some resources that can help you implement this solution:

    1. Azure Blob Storage REST API documentation: https://docs.microsoft.com/en-us/rest/api/storageservices/blob-service-rest-api
    2. Uploading large blobs in blocks: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet#upload-blobs-in-blocks
    3. Sample code for uploading large blobs in blocks using the Azure Blob Storage REST API: https://github.com/Azure-Samples/storage-blobs-dotnet-quickstart/blob/master/UploadBlobs.cs
    4. Salesforce Apex REST API documentation: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_rest.htm

    Additional information: We also recommend the AzCopy tool for uploading files from on-premises or from another cloud. This command-line tool copies blobs or files to or from an Azure storage account with optimal performance. AzCopy supports concurrency and parallelism, and can resume copy operations that are interrupted, which makes it well suited to uploading and downloading large files. Please consider using this tool for larger files.
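
    As a sketch, an AzCopy upload of a single large local file might look like the following; the account name, container, file path, and SAS token are placeholders you would replace with your own values:

    ```shell
    # Upload one large file to a blob container, authenticating with a SAS token.
    # <account>, <container>, and <sas-token> are placeholders.
    azcopy copy "/path/to/largefile.bin" \
      "https://<account>.blob.core.windows.net/<container>/largefile.bin?<sas-token>"
    ```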

    Storage limits: https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#storage-limits


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.