I have a requirement to run a query against our datamart, create a file, and SFTP the file to a remote server on a regular schedule using Data Factory. I have developed a Python script that works, and I've set up a data factory, a batch account, a storage account, and a batch pool. I can trigger the task from the pipeline I set up in Data Factory Studio and the job runs successfully. I had the SA who manages the remote server whitelist the external IP of my batch pool VM. The task runs as expected and the file gets transmitted successfully.
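For context, the script does roughly the following (a simplified sketch, not my actual code; the host, credentials, and paths are placeholders, and I'm using paramiko for the SFTP step):

```python
# Sketch of the export-and-upload task that runs on the batch pool VM.
# All connection details below are placeholders.
import csv
import io


def rows_to_csv(rows, header):
    """Serialize datamart query results to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()


def upload_sftp(host, username, key_path, local_path, remote_path):
    """Push the generated file to the remote server over SFTP."""
    import paramiko  # third-party library, installed on the pool VM

    with paramiko.SSHClient() as ssh:
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username=username, key_filename=key_path)
        with ssh.open_sftp() as sftp:
            # This outbound connection is what the SA whitelists by source IP.
            sftp.put(local_path, remote_path)
```

The SFTP connection at the end is the part that depends on the whitelisted source IP.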
The problem is that, as I understand it, the IP of my batch pool VM can change any time the VM is restarted. I downloaded the Azure public IP ranges for my region from Microsoft, but the pool of potential IPs is over 5.2 million addresses, and I can't ask the SA to whitelist that many.
How can I either configure my existing batch pool to use a static IP (or a small range of IPs), or create a new batch pool that has one? I'm pretty new to Azure, so a step-by-step walkthrough would be really helpful.