Hi Utsav Mori,
I would suggest starting with the Data Factory quickstart (https://learn.microsoft.com/en-us/azure/data-factory/quickstart-get-started); from there you can set up a pipeline based on your requirements. It would look like this:
- Create an Azure Batch Account and Pool.
- Upload your Python script to Azure Blob Storage.
- Create a Data Factory and a new pipeline in it.
- Add a Custom Activity in the pipeline to run your Python script.
- Define the input and output datasets linked to your Blob Storage.
- Validate and publish the pipeline.
- Create a schedule trigger for automation.
- Monitor the pipeline’s performance.
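For step 4, the Custom Activity in the pipeline definition can be sketched as JSON like the following. The linked service names, script name, and folder path here are placeholders; substitute the ones from your own Data Factory and storage account:

```json
{
    "name": "RunPythonScript",
    "type": "Custom",
    "linkedServiceName": {
        "referenceName": "AzureBatchLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "command": "python main.py",
        "resourceLinkedService": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "folderPath": "scripts/"
    }
}
```

Note that the command runs on the Batch pool nodes, so Python (and any packages your script imports) must be installed there, for example via a pool start task.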
Additional Microsoft how-to guides:
- https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-schedule-trigger?tabs=data-factory
- https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal
If the information helped address your question, please Accept the answer.
Luis