Best way to transfer large data from Azure Storage to Azure VM — is FTP recommended?

Osama AlGhamdi 0 Reputation points
2025-04-30T07:38:20.7366667+00:00

Hi,

I have a database server (VM) hosted on Azure, and I need to transfer a large amount of data from the backup to my computer.

Many people have suggested using the FTP protocol for the transfer, but I'm not sure about the best way to implement this.

Could you please advise on the recommended approach and how to do it properly?

Thanks.

Azure Data Share
An Azure service that is used to share data from multiple sources with other organizations.

1 answer

Sort by: Most helpful
  1. Abiola Akinbade 27,530 Reputation points Moderator
    2025-04-30T07:50:08.1433333+00:00

    No, this is not recommended. FTP lacks encryption by default, and it doesn’t scale well for large transfers or automation.

    Have you considered AzCopy? https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?tabs=dnf

    It's built to handle large files and large numbers of files efficiently. With AzCopy, you can authenticate using SAS tokens or Microsoft Entra ID.
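
    As a minimal sketch of the SAS-token approach, the command below downloads a backup blob from Azure Storage to a local folder. The account name, container name, file name, and SAS token are placeholders you would replace with your own values:

    ```shell
    # Download a blob to the current machine with AzCopy, authenticating via a SAS token.
    # <account>, <container>, backup.bak, and <SAS-token> are placeholder values.
    azcopy copy "https://<account>.blob.core.windows.net/<container>/backup.bak?<SAS-token>" "./backup.bak"
    ```

    The SAS token is appended to the blob URL as a query string; generate one with limited permissions (read-only) and a short expiry from the Azure portal or Azure CLI.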

    If the large backup file is only on the VM's attached disk and not already in Azure Storage, AzCopy cannot pull it directly from the VM disk to your local computer. In that case, first move the file off the VM using a standard file transfer method, for example by uploading it to a blob container or an Azure file share from within the VM, and then use AzCopy to download it to your computer.
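
    That two-step flow could look like the following sketch, assuming the backup sits at a hypothetical path on the VM's data disk and that a target container already exists (all names and the SAS token are placeholders):

    ```shell
    # Step 1 (run on the Azure VM): upload the backup from the attached disk to blob storage.
    # D:\backups\mydb.bak, <account>, backups, and <SAS-token> are placeholder values.
    azcopy copy "D:\backups\mydb.bak" "https://<account>.blob.core.windows.net/backups/mydb.bak?<SAS-token>"

    # Step 2 (run on your local computer): download the blob from storage.
    azcopy copy "https://<account>.blob.core.windows.net/backups/mydb.bak?<SAS-token>" "./mydb.bak"
    ```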

    You can mark it 'Accept Answer' and 'Upvote' if this helped you

    Regards,

    Abiola

