Azure Batch service charges are based on the following components:
- Compute costs: Charges for the VMs in your pool, based on the VM size, the number of VMs, and how long they run.
- Storage costs: Charges for storing files in Azure Storage, including input data, application packages, and output data.
- Data transfer costs: Charges for data transfer between the Azure Batch service and Azure Storage, as well as between different regions.
Regarding your issue with running the Python job, the problem appears to be the directory structure and the way paths are referenced in your scripts.
You can create a zip file of your run_python folder, including both the test.py script and the utils folder. This ensures the directory structure is preserved when the files are uploaded to the Batch node.
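As a minimal sketch of that packaging step (assuming the layout described above, i.e. run_python/test.py and run_python/utils/), you could build the archive with Python's standard library:

```python
import shutil
from pathlib import Path


def package_job(source_dir: str, archive_name: str = "run_python") -> str:
    """Zip source_dir so its internal layout (test.py, utils/) is preserved.

    Returns the path of the created .zip file.
    """
    source = Path(source_dir)
    # Using root_dir=source.parent and base_dir=source.name keeps the
    # top-level run_python/ folder inside the archive, so imports from
    # utils/ still resolve after extraction on the Batch node.
    return shutil.make_archive(
        archive_name, "zip", root_dir=source.parent, base_dir=source.name
    )
```

The key detail is the root_dir/base_dir split: zipping from inside the folder instead would flatten the top-level directory out of the archive and break relative paths on the node.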
Then, upload the zip file to an Azure Storage account that is accessible by your Azure Batch account.
In your Azure Batch task (or a start task on the pool), include a command that unzips the package in the correct location on the Batch node before the job runs.
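A minimal sketch of the extraction step that command performs (the archive name and the run_python folder name are assumptions carried over from above):

```python
import zipfile
from pathlib import Path


def unpack_job(archive_path: str, target_dir: str = ".") -> Path:
    """Extract the uploaded zip so run_python/test.py and run_python/utils/
    land next to each other, preserving the original directory structure.

    Returns the path of the extracted run_python folder.
    """
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(target_dir)
    return Path(target_dir) / "run_python"
```

On a Linux pool, an equivalent task command line would be something along the lines of `/bin/bash -c "unzip -o run_python.zip && python run_python/test.py"` (the exact paths depend on your setup). Either way, once the structure is restored, test.py can import from the utils folder just as it does locally.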