Slow upload to azure blob storage with python

Mikhael Abdallah de Oliveira Pinto 21 Reputation points
2023-01-06T15:07:13.983+00:00

The API receives the file, then tries to create a unique blob name. Then I upload it to the blob in chunks of 4 MB. Each chunk takes about 8 seconds; at 4 MB per 8 seconds that is roughly 4 Mbps, while my upload speed is 110 Mbps. Is this normal? I tried uploading a 50 MB file and it took almost 2 minutes. I don't know if the azure-storage-blob version is related to this; I'm using azure-storage-blob==12.14.1

    import uuid
    import os
    from azure.storage.blob import BlobClient, BlobBlock, BlobServiceClient
    import time
    from flask import request

    @catalog_api.route("/catalog", methods=['POST'])
    def catalog():
        file = request.files['file']

        # `uploader` is an instance of the class the methods below belong to.
        url_bucket, file_name, file_type = uploader.upload_to_blob(file)


    # The functions below are methods of that uploader class: self.connection_string,
    # self.container_name, self.max_blob_name_tries and self.chunk_size (4 MB) are set
    # in its __init__, and self.generate_blob_name builds a candidate blob name (not shown).

    def upload_to_blob(self, file):
        file_name = file.filename
        file_type = file.content_type

        blob_client = self.generate_blob_client(file_name)
        blob_url = self.upload_chunks(blob_client, file)
        return blob_url, file_name, file_type


    def generate_blob_client(self, file_name: str):
        blob_service_client = BlobServiceClient.from_connection_string(self.connection_string)
        container_client = blob_service_client.get_container_client(self.container_name)

        # Keep generating names until one does not collide with an existing blob.
        for _ in range(self.max_blob_name_tries):
            blob_name = self.generate_blob_name(file_name)

            blob_client = container_client.get_blob_client(blob_name)
            if not blob_client.exists():
                return blob_client
        raise Exception("Couldn't create the blob")


    def upload_chunks(self, blob_client: BlobClient, file):
        block_list = []
        chunk_size = self.chunk_size
        while True:
            read_data = file.read(chunk_size)

            if not read_data:
                print("uploaded")
                break

            print("uploading")
            # Stage each chunk as an uncommitted block; commit the whole list at the end.
            blk_id = str(uuid.uuid4())
            blob_client.stage_block(block_id=blk_id, data=read_data)
            block_list.append(BlobBlock(block_id=blk_id))

        blob_client.commit_block_list(block_list)

        return blob_client.url
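
For comparison, here is a minimal sketch that lets the SDK split the file and upload blocks in parallel instead of staging them one at a time. The connection string, container name and file name below are placeholders, and the block sizes and `max_concurrency` value are only illustrative, not tuned recommendations:

    import os
    import uuid

    from azure.storage.blob import BlobServiceClient

    # Placeholders: supply your own connection string and container name.
    connection_string = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    container_name = "my-container"

    # max_block_size / max_single_put_size control how the SDK splits the upload;
    # larger blocks mean fewer round trips for large files.
    service_client = BlobServiceClient.from_connection_string(
        connection_string,
        max_block_size=8 * 1024 * 1024,       # 8 MB blocks
        max_single_put_size=8 * 1024 * 1024,  # one request for anything up to 8 MB
    )
    blob_client = service_client.get_blob_client(
        container=container_name,
        blob=f"{uuid.uuid4()}-myfile.bin",
    )

    with open("myfile.bin", "rb") as data:
        # max_concurrency uploads several staged blocks at the same time.
        blob_client.upload_blob(data, overwrite=True, max_concurrency=4)

Since upload_blob accepts any object with a read method, the Flask file from request.files can be passed to it directly instead of the opened file used here.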

Accepted answer
  1. SaiKishor-MSFT 17,326 Reputation points
    2023-01-12T00:07:58.6533333+00:00

    @Mikhael Abdallah de Oliveira Pinto Thanks for reaching out to Microsoft Q&A. I understand that you are seeing high latency when uploading to Azure Blob Storage with Python.

    Have you checked the time taken to upload in any other way, such as via the Portal or Azure Storage Explorer, and was it faster or about the same speed?

    The latency depends on your location, the region of the storage account, the distance between them, your link speed, and many other factors. Where are you uploading the data from, and where is your storage account located?

    You can also test upload speed to Azure Storage using this link: https://www.azurespeed.com/Azure/Upload
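
    For a rough client-side measurement, a sketch like the following (connection string, container name and test file path are placeholders) prints the effective throughput of a single upload, which you can compare against your 110 Mbps link:

        import os
        import time

        from azure.storage.blob import BlobServiceClient

        # Placeholders: supply your own connection string, container and test file.
        connection_string = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
        service_client = BlobServiceClient.from_connection_string(connection_string)
        blob_client = service_client.get_blob_client(container="my-container", blob="timing-test.bin")

        path = "test-50mb.bin"
        size_mb = os.path.getsize(path) / (1024 * 1024)

        start = time.perf_counter()
        with open(path, "rb") as data:
            blob_client.upload_blob(data, overwrite=True)
        elapsed = time.perf_counter() - start

        # Effective throughput in megabits per second.
        print(f"{size_mb:.1f} MB in {elapsed:.1f} s -> {size_mb * 8 / elapsed:.1f} Mbps")

    Running the same script from a VM in the same region as the storage account is a quick way to separate network distance from anything in the application code.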

    Hope this helps. Please do let us know if you have any other questions in the meanwhile. Thank you!

