Copying Bulk Data from DEV to PROD in Azure Blob Storage

Keerthana, D 40 Reputation points
Oct 28, 2024, 4:24 PM

I have a requirement to copy bulk data in blob storage from one environment to another. Can anyone suggest some methods and tactics to accomplish this task?


Accepted answer
  Keshavulu Dasari 3,790 Reputation points Microsoft Vendor
    Oct 28, 2024, 8:17 PM

    Hi Keerthana, D,
    Welcome to Microsoft Q&A Forum, thank you for posting your query here!
    Copying bulk data between environments in Azure Blob Storage can be managed efficiently using several methods:
    1. AzCopy

    AzCopy is a command-line utility designed for copying data to and from Azure Blob Storage. It supports copying blobs, directories, and containers between storage accounts. Here’s a basic example of how to use AzCopy:

    Bash
    azcopy copy 'https://<source-storage-account>.blob.core.windows.net/<container>/<blob-path>?<SAS-token>' 'https://<destination-storage-account>.blob.core.windows.net/<container>/<blob-path>?<SAS-token>'
    

    This command uses server-to-server APIs, ensuring data is copied directly between storage servers.
    For more information:
    https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs-copy
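    As a sketch of a bulk transfer, the single-blob command above can be extended with --recursive to copy an entire container in one operation. The account names, container name, and SAS tokens below are placeholders, not values from this thread:

```shell
#!/usr/bin/env bash
# Sketch: server-to-server bulk copy of a whole container with AzCopy.
# All account names, container names, and SAS tokens are placeholders.
SRC_ACCOUNT="devstorage"         # hypothetical DEV storage account
DST_ACCOUNT="prodstorage"        # hypothetical PROD storage account
CONTAINER="data"
SRC_SAS="<SAS-with-read-list>"   # placeholder SAS token
DST_SAS="<SAS-with-write>"       # placeholder SAS token

SRC_URL="https://${SRC_ACCOUNT}.blob.core.windows.net/${CONTAINER}?${SRC_SAS}"
DST_URL="https://${DST_ACCOUNT}.blob.core.windows.net/${CONTAINER}?${DST_SAS}"

# --recursive walks every blob under the container; the transfer runs
# server-to-server, so the data never passes through the local machine.
if command -v azcopy >/dev/null 2>&1; then
    azcopy copy "$SRC_URL" "$DST_URL" --recursive
fi
```

    For repeated DEV-to-PROD refreshes, azcopy sync (with the same URL arguments) copies only blobs that differ, which is usually cheaper than a full re-copy.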

    2. Azure Data Factory

    Azure Data Factory (ADF) is a powerful tool for orchestrating data movement and transformation. You can create pipelines to automate the copying of data from one blob storage to another. Here’s a high-level overview of the steps:

    1. Create a Data Factory: Set up a new data factory in the Azure portal.
    2. Create Linked Services: Define linked services for your source and destination storage accounts.
    3. Create Datasets: Define datasets for the source and destination blobs.
    4. Create Pipelines: Build pipelines to perform the copy operation. You can use the Copy Data tool in ADF to simplify this process.

    For more information:
    https://learn.microsoft.com/en-us/azure/data-factory/tutorial-bulk-copy-portal
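    Once the linked services and datasets exist, the copy pipeline itself is a small JSON document. The sketch below assumes hypothetical dataset names (DevBlobDataset, ProdBlobDataset), a hypothetical resource group and factory name, and requires the datafactory Azure CLI extension (az extension add --name datafactory); it writes a minimal binary-copy pipeline and deploys it:

```shell
#!/usr/bin/env bash
# Sketch: deploy a minimal ADF copy pipeline from the CLI.
# Resource names and dataset references below are hypothetical.
cat > pipeline.json <<'EOF'
{
  "activities": [
    {
      "name": "CopyDevToProd",
      "type": "Copy",
      "inputs":  [ { "referenceName": "DevBlobDataset",  "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "ProdBlobDataset", "type": "DatasetReference" } ],
      "typeProperties": {
        "source": { "type": "BinarySource" },
        "sink":   { "type": "BinarySink" }
      }
    }
  ]
}
EOF

# Deploy only when the Azure CLI is available and logged in.
if az account show >/dev/null 2>&1; then
    az datafactory pipeline create \
        --resource-group "my-rg" \
        --factory-name "my-adf" \
        --name "CopyDevToProdPipeline" \
        --pipeline @pipeline.json
fi
```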

    3. Azure CLI

    You can use Azure CLI commands to automate the copying of blobs. The az storage blob copy start command initiates an asynchronous, server-side copy of a blob. For more information:

    https://techcommunity.microsoft.com/t5/microsoft-developer-community/azure-tips-and-tricks-how-to-move-azure-storage-blobs-between/ba-p/3545304
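    For bulk work, the batch variant of that command queues a copy for every blob in a container in one call. A minimal sketch, assuming placeholder account names and a placeholder source SAS token:

```shell
#!/usr/bin/env bash
# Sketch: bulk server-side copy with the Azure CLI batch command.
# Account names, containers, and the SAS token are placeholders.
SRC_ACCOUNT="devstorage"      # hypothetical DEV account
DST_ACCOUNT="prodstorage"     # hypothetical PROD account
SRC_CONTAINER="data"
DST_CONTAINER="data"
SRC_SAS="<source-SAS-token>"  # placeholder

# start-batch queues an asynchronous server-side copy for every blob
# in the source container; --pattern can narrow it (e.g. '*.csv').
# Run only when the Azure CLI is available and logged in.
if az account show >/dev/null 2>&1; then
    az storage blob copy start-batch \
        --account-name "$DST_ACCOUNT" \
        --destination-container "$DST_CONTAINER" \
        --source-account-name "$SRC_ACCOUNT" \
        --source-container "$SRC_CONTAINER" \
        --source-sas "$SRC_SAS"
fi
```

    Because the copy is asynchronous, az storage blob show on a destination blob reports its copy status; check that copies have completed before pointing PROD workloads at the new data.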

    These methods should help you efficiently manage bulk data transfers between your DEV and PROD environments.

    If you have any other questions or are still running into issues, let me know in the comments and I would be happy to help.


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.

