@Sumit Gupta Welcome to Microsoft Q&A Forum, Thank you for posting your query here!
Yes, you can upload large files in chunks using the Azure Blob Storage REST API. Here are the high-level steps you can follow:
- Divide the large file into smaller chunks (blocks). The maximum block size depends on the service version: up to 4000 MiB per block in recent versions (2019-12-12 and later), 100 MiB from version 2016-05-31, and 4 MiB in earlier versions, with up to 50,000 blocks per blob.
- Upload each chunk using the "Put Block" operation. This operation stages a block of data against the specified block blob without committing it.
- Commit the staged blocks using the "Put Block List" operation. Committing the block list is what actually creates (or replaces) the blob, so a separate "Put Blob" call is not needed for chunked uploads; "Put Blob" is only used to upload a blob's entire content in a single request.
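As a minimal sketch, the Put Block / Put Block List flow above can be called directly against the REST endpoints, for example in Python with only the standard library. The account URL, blob name, SAS token, and helper names below are placeholders and assumptions, not values from your environment:

```python
import base64
from urllib import parse, request

# Assumptions: replace with your own storage account, container, blob name,
# and a SAS token that grants write access.
ACCOUNT_URL = "https://mystorageaccount.blob.core.windows.net/uploads/large-file.bin"
SAS_TOKEN = "sv=...&sig=..."
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per block; recent service versions allow much larger
API_VERSION = "2020-10-02"

def make_block_id(index):
    """Block IDs must be Base64-encoded and the same length for every block."""
    return base64.b64encode(f"{index:08d}".encode()).decode()

def build_block_list_xml(block_ids):
    """Request body for Put Block List: commits the staged blocks in order."""
    latest = "".join(f"<Latest>{bid}</Latest>" for bid in block_ids)
    return f"<?xml version='1.0' encoding='utf-8'?><BlockList>{latest}</BlockList>"

def _put(url, body, content_type="application/octet-stream"):
    req = request.Request(
        url,
        data=body,
        method="PUT",
        headers={"x-ms-version": API_VERSION, "Content-Type": content_type},
    )
    with request.urlopen(req) as resp:  # raises HTTPError on failure
        return resp.status

def upload_in_blocks(path):
    block_ids = []
    with open(path, "rb") as f:
        for index, chunk in enumerate(iter(lambda: f.read(CHUNK_SIZE), b"")):
            block_id = make_block_id(index)
            block_ids.append(block_id)
            # Put Block: stage one chunk against the (not yet committed) blob.
            # The Base64 block ID must be percent-encoded in the query string.
            quoted = parse.quote(block_id, safe="")
            _put(f"{ACCOUNT_URL}?comp=block&blockid={quoted}&{SAS_TOKEN}", chunk)
    # Put Block List: committing the list is what creates (or replaces) the blob.
    _put(
        f"{ACCOUNT_URL}?comp=blocklist&{SAS_TOKEN}",
        build_block_list_xml(block_ids).encode("utf-8"),
        content_type="application/xml",
    )
```

Blocks that are staged but never committed are discarded by the service after a retention period, so a failed upload can simply be retried.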
Here are some resources that can help you implement this solution:
- Azure Blob Storage REST API documentation: https://docs.microsoft.com/en-us/rest/api/storageservices/blob-service-rest-api
- Uploading large blobs in blocks: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet#upload-blobs-in-blocks
- Sample code for uploading large blobs in blocks using the Azure Blob Storage REST API: https://github.com/Azure-Samples/storage-blobs-dotnet-quickstart/blob/master/UploadBlobs.cs
- Salesforce Apex REST API documentation: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_rest.htm
Additional information: We also recommend the AzCopy tool for uploading files from on-premises or cloud sources. AzCopy is a command-line tool for copying blobs or files to or from an Azure storage account with optimal performance. It supports concurrency and parallelism, can resume interrupted copy operations, and delivers high performance when uploading or downloading large files, so please consider it for larger files.
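As a minimal sketch, an AzCopy upload of a large local file could look like the following command; the source path, account URL, and SAS token are placeholders (assumptions):

```shell
# Sketch only: replace the local path, storage URL, and <SAS> with your own values.
# --block-size-mb controls the chunk size AzCopy uses when splitting large files.
azcopy copy "/data/large-file.bin" \
  "https://mystorageaccount.blob.core.windows.net/uploads/large-file.bin?<SAS>" \
  --block-size-mb 8
```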
Storage limits: https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#storage-limits
- If you want to upload larger files to a file share or blob storage, there is also the Azure Storage Data Movement Library.
- A few similar Stack Overflow threads discuss this scenario and may give you some ideas:
  - https://stackoverflow.com/questions/41829911/azure-rest-api-put-blob
  - https://stackoverflow.com/questions/61481720/upload-video-in-chunks-azure-blob-storage
  - https://stackoverflow.com/questions/61857337/how-to-upload-a-large-file-in-chunks-with-parallelism-in-azure-sdk-v12
- You can use Azure Data Factory to copy files from Azure Blob Storage to Salesforce. Please follow the steps below:
- Open Azure Data Factory Studio.
- Click on Manage. Under Linked services, click New, search for Salesforce, and add your Salesforce account.
- In the same way, add a new linked service for Azure Blob Storage and add your blob storage account.
- Now click on Author, click +, and select the Copy Data tool.
- Set the source to Azure Blob Storage and the destination to Salesforce, then fill in the details.
Please do not forget to "Accept the answer" and "Up-vote" wherever the information provided helps you, as this can be beneficial to other community members.