What is the best approach to upload 3 GB files to Azure Blob Storage from the Azure SDK (on-premises), and how to avoid failure issues

Yallamaraju Goutham kumar 0 Reputation points
2023-02-27T11:28:13.5766667+00:00

What is the best approach to upload 3 GB files to Azure Blob Storage from the Azure SDK (on-premises) with the help of a SAS token, and how can failure issues be avoided?

Example: I am trying to upload a 3 GB file to Azure Blob Storage from the Azure SDK (on-premises). 2 GB was uploaded and the remaining 1 GB still had to be uploaded when my network got disconnected. Please suggest the best approach to avoid such errors.


2 answers

  1. Carlos Solís Salazar 18,196 Reputation points MVP Volunteer Moderator
    2023-02-27T12:38:17.8866667+00:00

    Thank you for asking this question on the Microsoft Q&A Platform.

    The easiest way to copy files from on-premises to Blob Storage is to use the AzCopy command-line tool.

    Also, to recover from errors during the copy, you can use the sync option of AzCopy (azcopy sync), which compares the source and destination and transfers only what is missing, so an interrupted copy can simply be re-run.
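
    For example, a minimal sketch of both commands, assuming AzCopy v10 is installed on the on-premises machine and a SAS token has been generated for the target container (the account, container, and local paths are placeholders):

        # one-off copy of a single large file, authorized with a SAS URL
        azcopy copy "C:\data\bigfile.bin" "https://<account>.blob.core.windows.net/<container>/bigfile.bin?<sas-token>"

        # re-runnable sync of a local folder; only files missing or changed at the destination are transferred
        azcopy sync "C:\data" "https://<account>.blob.core.windows.net/<container>?<sas-token>"

    AzCopy also records job state locally, so an interrupted transfer can be picked up with azcopy jobs resume <job-id> instead of starting over.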

    Hope this helps!


    If any of the above helped, please Accept Answer and Upvote; this thread can then help others in the community looking for remediation for similar issues.


  2. SaiKishor-MSFT 17,336 Reputation points
    2023-03-06T19:21:24.94+00:00

    @Yallamaraju Goutham kumar Thanks for reaching out to Microsoft Q&A.

    To upload large files to Azure Blob storage, you can use the Azure Blob storage REST API or the Azure Blob storage client library for .NET.

    To avoid failure issues during the upload process, you can use the following best practices:

    1. Use block blobs instead of page blobs for uploading large files. Block blobs are optimized for uploading large amounts of data, while page blobs are optimized for random read/write operations.
    2. Use the Azure Blob storage client library for .NET to upload the file. The client library provides a number of features that can help you avoid failure issues, such as automatic retries and progress tracking (a minimal sketch follows this list).
    3. Use a SAS token to secure the upload process. A SAS token is a secure way to grant limited access to your Azure Blob storage account.
    4. Use the Put Block and Put Block List operations to upload the file in blocks. This allows you to upload the file in smaller chunks, which can help you avoid failure issues if the network connection is lost (see the resumable sketch at the end of this answer).
    5. Use the Azure Blob storage client library's progress tracking feature to monitor the upload. This lets you see how much of the file has been transferred, so you know where to resume from if the upload fails.
    6. Use the Azure Blob storage client library's retry policy to automatically retry the upload if it fails. This can help you avoid failure issues if the network connection is lost.
    7. Use the Azure Blob storage client library's parallel upload feature to upload multiple blocks in parallel. This can help you upload the file faster, which shortens the window in which a network interruption can occur.
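
    As a minimal sketch of points 2-7 combined, assuming the Azure.Storage.Blobs (v12) client library for .NET and a pre-generated SAS URL for the destination blob (the account, container, local file path, and tuning values below are placeholders, not recommendations):

        using System;
        using System.Threading.Tasks;
        using Azure.Core;
        using Azure.Storage;
        using Azure.Storage.Blobs;
        using Azure.Storage.Blobs.Models;

        class LargeBlobUpload
        {
            static async Task Main()
            {
                // Hypothetical SAS URL for the destination blob (account, container, blob name, and token are placeholders).
                var blobUri = new Uri("https://<account>.blob.core.windows.net/<container>/bigfile.bin?<sas-token>");

                // Point 6: let the client library retry transient failures automatically.
                var clientOptions = new BlobClientOptions
                {
                    Retry =
                    {
                        Mode = RetryMode.Exponential,
                        MaxRetries = 5,
                        Delay = TimeSpan.FromSeconds(2)
                    }
                };

                var blobClient = new BlobClient(blobUri, clientOptions);

                var uploadOptions = new BlobUploadOptions
                {
                    // Points 4 and 7: the library splits the file into blocks (Put Block),
                    // uploads them in parallel, and commits them (Put Block List).
                    TransferOptions = new StorageTransferOptions
                    {
                        InitialTransferSize = 8 * 1024 * 1024, // size of the first request / single-shot threshold
                        MaximumTransferSize = 8 * 1024 * 1024, // size of each block (chunk)
                        MaximumConcurrency = 4                 // number of blocks uploaded in parallel
                    },
                    // Point 5: progress tracking.
                    ProgressHandler = new Progress<long>(bytes =>
                        Console.WriteLine($"Uploaded {bytes} bytes"))
                };

                await blobClient.UploadAsync(@"C:\data\bigfile.bin", uploadOptions);
                Console.WriteLine("Upload complete.");
            }
        }

    With UploadAsync, the library handles the Put Block / Put Block List calls and the per-request retries for you; it does not persist state between process runs, though, so restarting the application starts the transfer again.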

    You can find more information on how to upload large files to Azure Blob storage in the following document: https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/storage/blobs/storage-quickstart-blobs-dotnet.md
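
    Expanding on point 4, here is a minimal sketch of a manually resumable upload built on Put Block / Put Block List (again assuming Azure.Storage.Blobs v12 and a hypothetical SAS URL). Chunks that were already staged before a disconnect are skipped on the next run, so only the remaining data is re-uploaded:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Text;
        using System.Threading.Tasks;
        using Azure.Storage.Blobs.Models;
        using Azure.Storage.Blobs.Specialized;

        class ResumableBlockUpload
        {
            const int BlockSize = 8 * 1024 * 1024; // 8 MiB per block (a block blob allows up to 50,000 blocks)

            static async Task Main()
            {
                // Hypothetical SAS URL for the destination blob.
                var blobUri = new Uri("https://<account>.blob.core.windows.net/<container>/bigfile.bin?<sas-token>");
                var blockBlob = new BlockBlobClient(blobUri);

                // Ask the service which blocks a previous (interrupted) run already staged.
                var staged = new HashSet<string>();
                try
                {
                    BlockList blockList = await blockBlob.GetBlockListAsync(BlockListTypes.Uncommitted);
                    foreach (var block in blockList.UncommittedBlocks)
                        staged.Add(block.Name);
                }
                catch (Azure.RequestFailedException)
                {
                    // Nothing staged yet: start from scratch.
                }

                var blockIds = new List<string>();
                using var file = File.OpenRead(@"C:\data\bigfile.bin");
                var buffer = new byte[BlockSize];

                for (long offset = 0, index = 0; offset < file.Length; offset += BlockSize, index++)
                {
                    // Deterministic block id derived from the block index, so a rerun produces the same ids.
                    string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes($"block-{index:D10}"));
                    blockIds.Add(blockId);

                    int toRead = (int)Math.Min(BlockSize, file.Length - offset);
                    if (staged.Contains(blockId))
                    {
                        file.Seek(toRead, SeekOrigin.Current); // this chunk was uploaded before the disconnect
                        continue;
                    }

                    int read = 0;
                    while (read < toRead)
                        read += await file.ReadAsync(buffer, read, toRead - read);

                    using var chunk = new MemoryStream(buffer, 0, toRead, writable: false);
                    await blockBlob.StageBlockAsync(blockId, chunk); // Put Block: upload one chunk
                }

                // Put Block List: commit the blocks in order to make the blob visible.
                await blockBlob.CommitBlockListAsync(blockIds);
            }
        }

    Uncommitted blocks are kept by the service for about a week, so rerunning the program within that window uploads only the chunks that are still missing and then commits the complete block list.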

