I am uploading a doc file of more than 100 MB to Azure Storage using the REST API from Node.js, but I am getting a timeout error. I have tried several ways to upload it in chunks but have not succeeded. Can anyone help me with this?

Ranjeet Kasture 0 Reputation points
2023-09-14T11:55:07.01+00:00



1 answer

  1. Sumarigo-MSFT 45,781 Reputation points Microsoft Employee
    2023-09-20T12:07:39.61+00:00

    @Ranjeet Kasture Welcome to the Microsoft Q&A Forum, and thank you for posting your query here!

    There are different ways you can upload large files to Blob.

    Block blobs are optimized for uploading large amounts of data efficiently. Block blobs are composed of blocks, each of which is identified by a block ID. A block blob can include up to 50,000 blocks. Each block in a block blob can be a different size, up to the maximum size permitted for the service version in use. To create or modify a block blob, write a set of blocks via the Put Block operation and then commit the blocks to a blob with the Put Block List operation.
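
    For the REST API from Node.js, that translates into one Put Block request per chunk followed by a single Put Block List request to commit them. Below is a minimal sketch, not production code: it assumes Node 18+ (for the built-in fetch) and that `sasUrl` is the full blob URL already carrying a SAS token with write/create permission; the 4 MB chunk size, API version, and content type are placeholder values to adjust to your own setup.

    ```javascript
    // Sketch: upload a large file as a block blob with Put Block + Put Block List.
    // Assumptions: Node 18+, sasUrl = https://<account>.blob.core.windows.net/<container>/<blob>?<sas>
    const fs = require("fs/promises");

    const CHUNK_SIZE = 4 * 1024 * 1024;   // 4 MB per block; well under the per-block limit
    const API_VERSION = "2021-08-06";     // any recent x-ms-version should work

    async function uploadInBlocks(filePath, sasUrl) {
      const fh = await fs.open(filePath, "r");
      const { size } = await fh.stat();
      const buf = Buffer.alloc(CHUNK_SIZE);
      const blockIds = [];

      for (let offset = 0, i = 0; offset < size; offset += CHUNK_SIZE, i++) {
        const { bytesRead } = await fh.read(buf, 0, CHUNK_SIZE, offset);

        // Block IDs must be base64 strings of equal length for every block in the blob.
        const blockId = Buffer.from(String(i).padStart(6, "0")).toString("base64");
        blockIds.push(blockId);

        // Put Block: PUT <blob-url>?<sas>&comp=block&blockid=<base64 id>
        const res = await fetch(`${sasUrl}&comp=block&blockid=${encodeURIComponent(blockId)}`, {
          method: "PUT",
          headers: { "x-ms-version": API_VERSION },
          body: Buffer.from(buf.subarray(0, bytesRead)), // copy, because buf is reused
        });
        if (!res.ok) throw new Error(`Put Block ${i} failed: ${res.status}`);
      }
      await fh.close();

      // Put Block List: commit the staged blocks in order to create the final blob.
      const blockListXml =
        '<?xml version="1.0" encoding="utf-8"?><BlockList>' +
        blockIds.map((id) => `<Latest>${id}</Latest>`).join("") +
        "</BlockList>";

      const commit = await fetch(`${sasUrl}&comp=blocklist`, {
        method: "PUT",
        headers: {
          "x-ms-version": API_VERSION,
          "x-ms-blob-content-type": "application/msword", // adjust to your file type
        },
        body: blockListXml,
      });
      if (!commit.ok) throw new Error(`Put Block List failed: ${commit.status}`);
    }

    uploadInBlocks("./large.doc", process.env.BLOB_SAS_URL).catch(console.error);
    ```

    Because each block is a separate small request, a slow connection no longer has to keep one huge request alive (which is what typically causes the timeout), and you can also stage several blocks in parallel, for example with Promise.all over batches, to speed up the upload.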

    Blobs that are less than a certain size (determined by service version) can be uploaded in their entirety with a single write operation via Put Blob.
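
    For comparison, a single-shot Put Blob from Node.js looks roughly like the sketch below (same caveats as above: Node 18+ and a SAS URL with write permission are assumed).

    ```javascript
    // Sketch: single-request Put Blob, suitable only for files under the single-shot limit.
    const fs = require("fs/promises");

    async function putBlob(filePath, sasUrl) {
      const body = await fs.readFile(filePath);
      const res = await fetch(sasUrl, {
        method: "PUT",
        headers: {
          "x-ms-version": "2021-08-06",
          "x-ms-blob-type": "BlockBlob", // required header for Put Blob
        },
        body,
      });
      if (!res.ok) throw new Error(`Put Blob failed: ${res.status}`);
    }
    ```

    A 100 MB+ file over a slow link is exactly the case where this single request tends to time out, which is why the block-based approach above is recommended for large files.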


    Upload large amounts of random data in parallel to Azure storage

    If you want to upload larger files to a file share or to blob storage, there is also the Azure Storage Data Movement Library. It provides high performance for uploading and downloading larger files; please consider using this library for larger files.

    Choose an Azure solution for data transfer: This article provides an overview of some of the common Azure data transfer solutions. The article also links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer.

    You may be getting this error because the maximum content size allowed for an uploadRange call (when uploading to an Azure file share) is 4 MB. The uploadRange operation maps to the Put Range REST API operation, and the limitation comes from the REST API side (see the description of Range or x-ms-range in the request headers section).

    What you have to do is read your content in chunks and then call the uploadRange method repeatedly with those chunks. When reading the chunks, make sure the size of each chunk does not exceed 4 MB; a rough sketch follows below.
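
    Note that uploadRange belongs to the Azure Files client (@azure/storage-file-share), not to Blob Storage. If that is the service you are targeting, the chunked loop could look roughly like this; the connection string, share name, and file name are placeholders for your own values.

    ```javascript
    // Sketch: chunked upload to an Azure file share with repeated uploadRange calls
    // (each call maps to the Put Range REST operation, limited to 4 MB per request).
    const fs = require("fs/promises");
    const { ShareServiceClient } = require("@azure/storage-file-share");

    const RANGE_SIZE = 4 * 1024 * 1024; // stay at or below the 4 MB Put Range limit

    async function uploadToFileShare(localPath, connectionString, shareName, fileName) {
      const service = ShareServiceClient.fromConnectionString(connectionString);
      const fileClient = service
        .getShareClient(shareName)
        .rootDirectoryClient.getFileClient(fileName);

      const fh = await fs.open(localPath, "r");
      const { size } = await fh.stat();
      await fileClient.create(size); // an Azure file must be created at its full size first

      const buf = Buffer.alloc(RANGE_SIZE);
      for (let offset = 0; offset < size; offset += RANGE_SIZE) {
        const { bytesRead } = await fh.read(buf, 0, RANGE_SIZE, offset);
        // Write one <= 4 MB range starting at `offset`.
        await fileClient.uploadRange(buf.subarray(0, bytesRead), offset, bytesRead);
      }
      await fh.close();
    }
    ```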

    Error when you write more than 4 MB of data to Azure Storage: Request body is too large

    Here are some resources that can help you implement this solution:

    1. Azure Blob Storage REST API documentation: https://docs.microsoft.com/en-us/rest/api/storageservices/blob-service-rest-api
    2. Uploading large blobs in blocks: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet#upload-blobs-in-blocks
    3. Sample code for uploading large blobs in blocks using the Azure Blob Storage REST API: https://github.com/Azure-Samples/storage-blobs-dotnet-quickstart/blob/master/UploadBlobs.cs
    4. Error when you write more than 4 MB of data to Azure Storage: Request body is too large

    If the issue still persists, please share your code and a screenshot of the error message.
    Are you able to see network traffic at all with a tool such as Fiddler, and are you able to upload small files successfully?

    Please let us know if you have any further queries. I’m happy to assist you further.     


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.

