Upload a block blob with Python
This article shows how to upload a blob using the Azure Storage client library for Python. You can upload data to a block blob from a file path, a stream, a binary object, or a text string. You can also upload blobs with index tags.
Prerequisites
- This article assumes you already have a project set up to work with the Azure Blob Storage client library for Python. To learn about setting up your project, including package installation, adding import statements, and creating an authorized client object, see Get started with Azure Blob Storage and Python.
- The authorization mechanism must have permissions to perform an upload operation. To learn more, see the authorization guidance for the following REST API operations: Put Blob, Put Block, and Put Block List.
Upload data to a block blob
To upload a blob using a stream or a binary object, use the upload_blob method.
This method creates a new blob from a data source with automatic chunking, meaning that the data source may be split into smaller chunks and uploaded. To perform the upload, the client library may use either Put Blob or a series of Put Block calls followed by Put Block List. This behavior depends on the overall size of the object and how the data transfer options are set.
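As a minimal sketch of the call (assuming blob_client is an already-authorized BlobClient and data is a bytes object or a readable stream):

# Minimal sketch; blob_client is assumed to be an authorized BlobClient
# data can be bytes, a readable stream, or an iterable of bytes
blob_client.upload_blob(data, blob_type="BlockBlob", overwrite=True)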
Upload a block blob from a local file path
The following example uploads a file to a block blob using a BlobClient object:
def upload_blob_file(self, blob_service_client: BlobServiceClient, container_name: str):
    container_client = blob_service_client.get_container_client(container=container_name)

    with open(file=os.path.join('filepath', 'filename'), mode="rb") as data:
        blob_client = container_client.upload_blob(name="sample-blob.txt", data=data, overwrite=True)
Upload a block blob from a stream
The following example creates random bytes of data and uploads a BytesIO object to a block blob using a BlobClient object:
def upload_blob_stream(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    input_stream = io.BytesIO(os.urandom(15))
    blob_client.upload_blob(input_stream, blob_type="BlockBlob")
Upload binary data to a block blob
The following example uploads binary data to a block blob using a BlobClient object:
def upload_blob_data(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    data = b"Sample data for blob"

    # Upload the blob data - default blob type is BlockBlob
    blob_client.upload_blob(data, blob_type="BlockBlob")
Upload a block blob with index tags
The following example uploads a block blob with index tags:
def upload_blob_tags(self, blob_service_client: BlobServiceClient, container_name: str):
    container_client = blob_service_client.get_container_client(container=container_name)
    sample_tags = {"Content": "image", "Date": "2022-01-01"}

    with open(file=os.path.join('filepath', 'filename'), mode="rb") as data:
        blob_client = container_client.upload_blob(name="sample-blob.txt", data=data, tags=sample_tags)
Upload a block blob with configuration options
You can define client library configuration options when uploading a blob. These options can be tuned to improve performance, enhance reliability, and optimize costs. The following code examples show how to define configuration options for an upload both at the method level and at the client level when instantiating BlobClient. These options can also be configured for a ContainerClient instance or a BlobServiceClient instance.
Specify data transfer options for upload
You can set configuration options when instantiating a client to optimize performance for data transfer operations. You can pass the following keyword arguments when constructing a client object in Python:
- max_block_size: The maximum chunk size for uploading a block blob in chunks. Defaults to 4 MiB.
- max_single_put_size: If the blob size is less than or equal to max_single_put_size, the blob is uploaded with a single Put Blob request. If the blob size is larger than max_single_put_size or unknown, the blob is uploaded in chunks using Put Block and committed using Put Block List. Defaults to 64 MiB.
For more information on transfer size limits for Blob Storage, see Scale targets for Blob storage.
For upload operations, you can also pass the max_concurrency argument when calling upload_blob. This argument defines the maximum number of parallel connections to use when the blob size exceeds 64 MiB.
The following code example shows how to specify data transfer options when creating a BlobClient object, and how to upload data using that client object. The values provided in this sample aren't intended to be a recommendation. To properly tune these values, you need to consider the specific needs of your app.
def upload_blob_transfer_options(self, account_url: str, container_name: str, blob_name: str):
    # Create a BlobClient object with data transfer options for upload
    blob_client = BlobClient(
        account_url=account_url,
        container_name=container_name,
        blob_name=blob_name,
        credential=DefaultAzureCredential(),
        max_block_size=1024*1024*4,  # 4 MiB
        max_single_put_size=1024*1024*8  # 8 MiB
    )

    with open(file=os.path.join(r'file_path', blob_name), mode="rb") as data:
        blob_client = blob_client.upload_blob(data=data, overwrite=True, max_concurrency=2)
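The same transfer options can also be set at the client level for a ContainerClient or BlobServiceClient; clients created from a configured BlobServiceClient inherit those settings. The following is a minimal sketch with a placeholder account URL:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Transfer options set here also apply to blob clients created from this service client
blob_service_client = BlobServiceClient(
    account_url="https://<storage-account-name>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
    max_block_size=1024*1024*4,  # 4 MiB
    max_single_put_size=1024*1024*8  # 8 MiB
)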
To learn more about tuning data transfer options, see Performance tuning for uploads and downloads with Python.
Set a blob's access tier on upload
You can set a blob's access tier on upload by passing the standard_blob_tier keyword argument to upload_blob. Azure Storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used.
The following code example shows how to set the access tier when uploading a blob:
def upload_blob_access_tier(self, blob_service_client: BlobServiceClient, container_name: str, blob_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

    # Upload blob to the cool tier
    with open(file=os.path.join(r'file_path', blob_name), mode="rb") as data:
        blob_client = blob_client.upload_blob(data=data, overwrite=True, standard_blob_tier=StandardBlobTier.COOL)
Setting the access tier is only allowed for block blobs. You can set the access tier for a block blob to Hot, Cool, Cold, or Archive. To set the access tier to Cold, you must use a minimum client library version of 12.15.0.
To learn more about access tiers, see Access tiers overview.
Upload a block blob by staging blocks and committing
You can have greater control over how to divide uploads into blocks by manually staging individual blocks of data. When all of the blocks that make up a blob are staged, you can commit them to Blob Storage.
Use the stage_block method to create a new block to be committed as part of a blob. Use the commit_block_list method to write a blob by specifying the list of block IDs that make up the blob.
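In isolation, the two calls look like the following sketch (blob_client is assumed to be an existing BlobClient; block IDs must be unique within the blob and the same length for every block):

import uuid
from azure.storage.blob import BlobBlock

# Sketch: stage a single block, then commit the block list
data = b"<block bytes>"
block_id = uuid.uuid4().hex  # unique ID; hex keeps every ID the same length
blob_client.stage_block(block_id=block_id, data=data, length=len(data))
blob_client.commit_block_list([BlobBlock(block_id=block_id)])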
The following example reads data from a file and stages blocks to be committed as part of a blob:
def upload_blocks(self, blob_container_client: ContainerClient, local_file_path: str, block_size: int):
    file_name = os.path.basename(local_file_path)
    blob_client = blob_container_client.get_blob_client(file_name)

    with open(file=local_file_path, mode="rb") as file_stream:
        block_id_list = []

        while True:
            buffer = file_stream.read(block_size)
            if not buffer:
                break

            block_id = uuid.uuid4().hex
            block_id_list.append(BlobBlock(block_id=block_id))

            blob_client.stage_block(block_id=block_id, data=buffer, length=len(buffer))

        blob_client.commit_block_list(block_id_list)
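A hypothetical call to the function above might stage 4-MiB blocks from a local file (the container name and file path are placeholders):

# Hypothetical usage of upload_blocks, staging 4 MiB blocks
container_client = blob_service_client.get_container_client(container="sample-container")
self.upload_blocks(container_client, local_file_path="path/to/file.bin", block_size=4*1024*1024)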
Resources
To learn more about uploading blobs using the Azure Blob Storage client library for Python, see the following resources.
REST API operations
The Azure SDK for Python contains libraries that build on top of the Azure REST API, allowing you to interact with REST API operations through familiar Python paradigms. The client library methods for uploading blobs use the following REST API operations:
- Put Blob (REST API)
- Put Block (REST API)
- Put Block List (REST API)
Code samples
Client library resources
See also