Unable to copy from blob storage account to Gen2 storage account between different subnets or VNets

suresh Reddy 41 Reputation points

Will Microsoft start supporting copying from a blob storage account to a Gen2 storage account between different subnets or VNets?

The target storage account is Gen2, and we are trying to achieve this via a Databricks job copy (Python blob copy URL), not via AzCopy.

Referring to the link below



Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 82,356 Reputation points Microsoft Employee

    @suresh Reddy - Thanks for the question and using MS Q&A platform.

    Yes, it is possible to copy data from a blob storage account to a Gen2 storage account between different subnets or VNets.

    To achieve this, you need to create a private endpoint for the Gen2 storage account in your virtual network and associate it with a subnet. Once the private endpoint is created, traffic from your VNet to the Gen2 storage account flows over it, so the copy can run between the two accounts even when they sit in different subnets or VNets.
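    Once the private endpoint and its private DNS zone are in place, you can sanity-check from the Databricks cluster that the account's hostname resolves to a private VNet address rather than a public one. A minimal sketch (the hostname is a placeholder, and `is_private_ip` / `resolves_privately` are hypothetical helpers for illustration):

    ```python
    import ipaddress
    import socket

    def is_private_ip(ip: str) -> bool:
        """True if the address is in a private (RFC 1918) range, i.e. a VNet IP."""
        return ipaddress.ip_address(ip).is_private

    def resolves_privately(hostname: str) -> bool:
        """Resolve the storage hostname and check it lands on the private endpoint."""
        return is_private_ip(socket.gethostbyname(hostname))

    # Example (run from inside the VNet; replace with your account name):
    # resolves_privately("<target_account_name>.dfs.core.windows.net")
    ```

    If this returns False from inside the VNet, the private DNS zone is likely not linked to the VNet and the traffic is still going over the public endpoint.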

    You can use the Azure Blob Storage Python SDK to copy data from the blob storage account to the Gen2 storage account. Here is an example of how to copy data using the SDK:

    from azure.storage.blob import BlobServiceClient

    # Source blob storage account connection string
    source_connection_string = "DefaultEndpointsProtocol=https;AccountName=<source_account_name>;AccountKey=<source_account_key>;EndpointSuffix=core.windows.net"
    # Target Gen2 storage account connection string
    target_connection_string = "DefaultEndpointsProtocol=https;AccountName=<target_account_name>;AccountKey=<target_account_key>;EndpointSuffix=core.windows.net"
    # Source container name
    source_container_name = "<source_container_name>"
    # Target container name
    target_container_name = "<target_container_name>"
    # Create a BlobServiceClient for the source blob storage account
    source_blob_service_client = BlobServiceClient.from_connection_string(source_connection_string)
    # Create a ContainerClient for the source container
    source_container_client = source_blob_service_client.get_container_client(source_container_name)
    # Create a BlobServiceClient for the target Gen2 storage account
    target_blob_service_client = BlobServiceClient.from_connection_string(target_connection_string)
    # Create a ContainerClient for the target container
    target_container_client = target_blob_service_client.get_container_client(target_container_name)
    # List the blobs in the source container
    blobs = source_container_client.list_blobs()
    # Copy each blob to the target container
    for blob in blobs:
        source_blob_client = source_container_client.get_blob_client(blob.name)
        target_blob_client = target_container_client.get_blob_client(blob.name)
        # Start a server-side copy from the source blob's URL
        # (if the source container is private, the URL must carry a SAS token)
        target_blob_client.start_copy_from_url(source_blob_client.url)

    Note that you need to replace the placeholders in the connection strings and container names with your own values.
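    If the source container is not publicly readable, `start_copy_from_url` needs a source URL the service can read, typically the blob URL plus a read SAS token. A hedged sketch using `generate_blob_sas` (the dummy key and placeholder names are illustrative, not from this thread):

    ```python
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import generate_blob_sas, BlobSasPermissions

    account_name = "<source_account_name>"      # placeholder
    container_name = "<source_container_name>"  # placeholder
    blob_name = "example.csv"                   # placeholder
    account_key = "PGRlbW8ta2V5Pg=="            # dummy base64 key; use your real account key

    # Generate a read-only SAS valid for one hour
    sas_token = generate_blob_sas(
        account_name=account_name,
        container_name=container_name,
        blob_name=blob_name,
        account_key=account_key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )

    source_url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas_token}"
    # Then pass it to the copy: target_blob_client.start_copy_from_url(source_url)
    ```

    Scope the SAS to the single blob and a short expiry, as above, rather than handing out an account-level key.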

    For more details, refer to Azure Storage Blobs client library for Python.

    Hope this helps. Do let us know if you have any further queries.

    If this answers your query, do click Accept Answer and Yes for "was this answer helpful". And if you have any further queries, do let us know.

    1 person found this answer helpful.
