Hello
I am planning a solution that leverages Azure File Storage.
User story:
- We need to store/update file shares on Azure across all regions, and those files should eventually be synced everywhere (like replication). We might provide a tool that lets people upload a file and distribute it to all regions.
- We have backend services in every region that access those file shares periodically (every 2 minutes) and download the latest file if its version has changed.
- The theoretical peak is 300 K download requests within 5 minutes, worldwide.
- Each file is at most hundreds of MBs and must finish downloading in under 30 seconds.
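The "poll every 2 minutes, download only on change" behavior above can be sketched as a version-marker check. This is a hedged illustration, not Azure-specific code: in a real implementation `get_version` might map to the file's ETag or custom metadata (e.g. via the Azure Files SDK's `get_file_properties`), and `download` to the actual file transfer; here plain stubs stand in so the logic runs without an Azure account.

```python
def sync_if_updated(get_version, download, state):
    """Download only when the remote version marker has changed."""
    remote = get_version()
    if remote != state.get("etag"):
        download()
        state["etag"] = remote
        return True   # file was downloaded
    return False      # unchanged, skipped

# Simulated polling cycles: the remote file changes on the third poll.
remote_versions = ["v1", "v1", "v2"]
downloads = []
state = {}
results = [
    sync_if_updated(lambda v=v: v, lambda: downloads.append("get"), state)
    for v in remote_versions
]
print(results)         # [True, False, True]
print(len(downloads))  # 2
```

The point of the check is that 300 K polls per cycle mostly turn into cheap metadata reads, and the expensive full download happens only when the version marker actually moves.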
Questions:
- We are concerned about IOPS and download capacity pressure. At a high level, how should we architect this using Azure File Storage/File Shares?
- Would domain-based DFS or Azure File Sync work for both the syncing and the downloading in this case?
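To make the capacity concern concrete, here is a back-of-envelope calculation from the numbers stated above. The 300 MB file size is an assumption standing in for "hundreds of MBs"; the peak window and request count come from the user story.

```python
# Assumptions: 300 K downloads in a 5-minute peak, ~300 MB per file,
# each download must complete within 30 seconds.
requests = 300_000
window_s = 5 * 60
file_mb = 300  # assumed stand-in for "hundreds of MBs"

rps = requests / window_s                            # average request rate
agg_gbps = requests * file_mb * 8 / 1000 / window_s  # aggregate egress, Gbit/s
per_client_mbps = file_mb * 8 / 30                   # to finish one file in 30 s

print(f"{rps:.0f} req/s")                  # 1000 req/s
print(f"{agg_gbps:.0f} Gbit/s aggregate")  # 2400 Gbit/s
print(f"{per_client_mbps:.0f} Mbit/s per client")  # 80 Mbit/s
```

Even averaged over the window, this is roughly 1,000 requests/s and terabits of aggregate egress, which is far beyond what a single file share or storage account is sized for, so the high-level design likely needs per-region replicas (and possibly a CDN or blob-based distribution tier for the downloads) rather than one central share.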
Thanks