Hi, have you checked ADF? It's designed specifically for moving data from A to B, and it sounds like exactly what you are trying to do here.
Daily dump of MySQL Flexible Server databases to a storage account blob container
Hi folks !
I have a MySQL Flexible Server with 2 databases, each about 50 GB.
The aim is to run mysqldump daily and copy the dump files to an Azure storage account.
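For reference, the dump-and-upload step can be sketched as a small shell script; the host, user, database name, and SAS URL below are placeholders, not values from this question:

```shell
# Hypothetical dump-and-upload sketch; host, user, database name, and the
# SAS URL are placeholders. A production script should add error handling
# (e.g. set -euo pipefail) and supply credentials via an option file.
STAMP="$(date +%F)"                      # e.g. 2024-05-26
DUMP="/tmp/db1-${STAMP}.sql.gz"

# --single-transaction gives a consistent snapshot without locking InnoDB tables.
mysqldump --host=myserver.mysql.database.azure.com \
          --user=dumpuser \
          --single-transaction --routines db1 | gzip > "$DUMP"

# Upload the compressed dump to the blob container via a SAS URL.
azcopy copy "$DUMP" "https://mystorage.blob.core.windows.net/backups/?<SAS>"
rm -f "$DUMP"
```

Compressing through gzip before the upload also shrinks a 9 GB dump considerably, which reduces the transfer that is failing.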
I tried to achieve this with a scheduled Azure DevOps pipeline using a self-hosted agent (both the MySQL server and the storage account are private).
The self-hosted agent is a container running in Azure Container Apps with 2 CPU cores and 4 Gi of memory.
The mysqldump is working well.
The issue is that the copy to the storage account fails at about 90% (I tried with a 9 GB dump file).
Is it failing because of azcopy, or because the self-hosted agent doesn't have enough memory?
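If memory on the 4 Gi agent is the culprit, azcopy's documented tuning settings for capping memory use are worth trying before changing the architecture; the values below are illustrative and the SAS URL is a placeholder:

```shell
# Illustrative azcopy tuning for a memory-constrained (4 Gi) agent.
# AZCOPY_BUFFER_GB and AZCOPY_CONCURRENCY_VALUE are documented azcopy
# environment variables; the SAS URL is a placeholder.
export AZCOPY_BUFFER_GB=1            # cap azcopy's in-memory buffer pool
export AZCOPY_CONCURRENCY_VALUE=8    # fewer parallel connections

# Larger blocks mean fewer in-flight parts for a ~9 GB file.
azcopy copy "/tmp/db1-dump.sql.gz" \
  "https://mystorage.blob.core.windows.net/backups/?<SAS>" \
  --block-size-mb 100
```

The azcopy log (printed at the end of a failed run) should say whether the failure was an out-of-memory kill or a network/timeout error, which would confirm or rule out the agent's memory.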
Do you think the best solution would be to use an Azure virtual machine and configure some cron jobs?
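If you go the VM route, the schedule itself is a one-line cron entry; the file location, user, and script path below are hypothetical:

```
# /etc/cron.d/mysql-backup (hypothetical): run the dump-and-upload script
# daily at 02:00 as the "backup" user, logging output.
0 2 * * * backup /opt/backup/mysql_dump_to_blob.sh >> /var/log/mysql-backup.log 2>&1
```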
Will large files work at all? Would an NFS-enabled storage account be better than Blob?
Thanks to all for your help and advice!
-
Wuppukonduru Venkata Vivek 90 Reputation points
2024-05-26T06:59:34.4266667+00:00
1 additional answer
-
ShaikMaheer-MSFT 38,456 Reputation points Microsoft Employee
2024-05-27T06:25:31.6933333+00:00
Hi Cloudy,
Thank you for posting query in Microsoft Q&A Platform.
Using ADF is a good option here. Azure Data Factory (ADF) is a service designed for exactly this kind of data integration. For ADF pricing, see the link below.
https://learn.microsoft.com/en-us/azure/data-factory/pricing-concepts
Hope this helps. Please let me know if you have any further queries.
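For completeness, once a factory and a copy pipeline exist, a run could be triggered from the Azure CLI with the `datafactory` extension; the factory, resource group, and pipeline names below are placeholders, not resources from this question:

```shell
# Hypothetical: start a run of an existing ADF pipeline from the CLI.
# Requires the extension: az extension add --name datafactory
PIPELINE_NAME="DailyMySqlBackup"     # placeholder pipeline name
az datafactory pipeline create-run \
  --resource-group myrg \
  --factory-name myadf \
  --name "$PIPELINE_NAME"
```

A schedule trigger defined on the pipeline itself would replace the DevOps cron schedule entirely.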
Please consider hitting the Accept Answer button. Accepted answers help the community as well. Thank you.