I've got a command-line app written in Python that runs in a container on my local machine under Docker Desktop. I use bind mounts to share my data files with the container and a volume where the container caches its own data. My current procedure is to run the image, shell into the container, run the command-line app on the mounted data, store the output in the same place, and finally exit/kill the container.
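For concreteness, this is roughly what that looks like today (the image name, paths, and volume name below are placeholders, not my real ones):

```bash
# One-time: create a named volume for the app's cache.
docker volume create myapp-cache

# Start the container with my data bind-mounted and the cache volume attached,
# then shell in. --rm cleans up the container on exit; the volume persists.
docker run -it --rm \
  --mount type=bind,source="$HOME/data",target=/data \
  --mount type=volume,source=myapp-cache,target=/cache \
  myapp:latest /bin/bash

# Inside the container: run the app against the mounted data and write the
# output back to the same place, e.g.
#   python myapp.py --input /data/raw --output /data/results
```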
I'd like to run this in the cloud, because I expect that will give me access to bigger, faster compute resources than my late-2014 Mac mini. So I started by checking out storage. Using the pricing calculator with default values, I'm told this is going to run me about $300 a month ($10/day). I find this surprising, since I think I pay $10/month for 2 TB of storage on iCloud.
Granted, I don't really need the default of 1 TiB. The actual data I have on hand is only about 1.5 GB, but I see that isn't an option in the calculator; 100 GiB seems to be the floor. Still, that comes in at about $160/month, which I also find exorbitant.
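If I'm doing the math right, that 100 GiB quote works out to roughly $1.60 per GiB per month, while the iCloud plan is about half a cent per GB per month, so something like a 300x difference per unit of storage.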
What am I not understanding? Did I use some defaults for crazy-fast, high-reliability storage with nuclear-winter failover? Or is that just what 100 GiB of simple file storage costs on Azure for a month?