Data Archive solution

Raz

Old data from an outgoing system needs to be stored in Azure.

4 TB of data, accessed twice a week from a VM.

Can anyone suggest a storage type, a method of access, the networking setup, and an overall high-level solution? TIA

Azure Data Lake Storage

Accepted answer
PRADEEPCHEEKATLA-MSFT (Microsoft Employee)

@Raz Thanks for the question and for using the MS Q&A platform.

    For storing and accessing 4TB of data in Azure that is accessed twice a week using a VM, I would recommend the following solution:

    1. Storage Type: Azure Blob Storage is a good option for storing large amounts of data that is accessed infrequently. You can store the data in a hot, cool, or archive tier based on your access patterns and cost requirements. For data read twice a week, the cool tier is the best fit: it offers lower storage cost than hot while keeping the data immediately readable. The archive tier has the lowest storage cost, but blobs must be rehydrated (which can take hours) before they can be read, so it only suits data accessed a few times per year.
    2. Method of Access: You can access the data using various methods, such as the Azure Portal, Azure Storage Explorer, AzCopy, Azure PowerShell, Azure CLI, or the Azure SDKs. From a Linux VM, you can mount a Blob container as a filesystem using blobfuse2 or an NFS 3.0-enabled storage account. If you need true SMB file-share semantics instead, consider Azure Files (optionally with Azure File Sync), which is a separate service from Blob Storage.
    3. Networking: For best performance, and to avoid cross-region egress charges, place the VM in the same Azure region as the storage account. The VM does not have to be in the same virtual network as the storage account; use Azure Virtual Network (VNet) service endpoints or Azure Private Link to keep the traffic between the VM and Blob Storage off the public internet.
    4. Pricing: You can use the Azure pricing calculator to estimate the cost of the solution. The cost will depend on the amount of data stored, the chosen tier, the volume of reads and data retrieval, and the configuration of the VM.
    5. Overall High-Level Solution: At a high level, you can follow these steps:
    • Create an Azure Blob Storage account in the same region as the VM, with the default access tier set to cool.
    • Create a Blob Storage container for the data. (Note that access tiers are set at the account level for hot/cool defaults, or per blob; the archive tier can only be set on individual blobs.)
    • Upload the 4 TB of data to the container using AzCopy. The Azure Portal and Storage Explorer also work, but AzCopy is better suited to multi-terabyte transfers.
    • Secure access to the storage account using Azure Private Link or VNet service endpoints.
    • On a Linux VM, mount the container as a filesystem using blobfuse2 (or an NFS 3.0 mount).
    • Access the data from the VM through the mount point.

    This solution provides a cost-effective and scalable way to store and access the archived data in Azure. To protect the data itself, you can also enable blob soft delete and versioning, use Azure Backup for blobs, or choose geo-redundant storage (GRS) to guard against a regional outage. (Azure Site Recovery applies to VM replication rather than blob data.)
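As a rough sense of scale for the pricing point above, here is a back-of-the-envelope comparison of monthly storage cost across tiers. The per-GB rates in the script are illustrative placeholders, not current prices; use the Azure pricing calculator for real, region-specific figures, and remember that archive adds retrieval and rehydration charges on every read.

```shell
# Back-of-the-envelope monthly storage cost for 4 TB across tiers.
# The $/GB/month rates are hypothetical placeholders, NOT real Azure prices.
SIZE_GB=4096   # 4 TB

awk -v gb="$SIZE_GB" 'BEGIN {
  hot = 0.018; cool = 0.010; archive = 0.002   # placeholder rates
  printf "hot: %.2f\ncool: %.2f\narchive: %.2f\n", gb*hot, gb*cool, gb*archive
}'
```

The storage-only gap looks large, but for data read twice a week the cool tier usually wins once archive rehydration delays and retrieval fees are factored in.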

    Hope this helps. Do let us know if you have any further queries.

    If this answers your query, please click "Accept Answer" and "Yes" for "Was this answer helpful".

    1 person found this answer helpful.
