Running a Python Script on a Server from Azure Data Factory

Babs 20 Reputation points
2025-07-28T22:05:00.39+00:00

I am developing an ETL solution in Azure Data Factory. I have a Python script, located on a server, that does some processing.

My orchestrator is Azure Data Factory. I don't have Databricks for cost reasons; it's just two Python scripts that need to be called by Azure Data Factory.

I'd like to call the Python script on the server from my Azure Data Factory solution.

Do you have an idea for this?

Thanks in advance.

Answer accepted by question author
  1. Kalyani Kondavaradala 4,600 Reputation points Microsoft External Staff Moderator
    2025-07-29T06:51:27.27+00:00

    Hello Babs,

    Thanks for reaching out on Microsoft Q&A!

     I understand that you want to run Python scripts through Azure Data Factory (ADF). Below are the steps you can follow to get this working:

    • Ensure you have an active Azure subscription. This is required to create and use Azure services such as Storage, Batch, and ADF.
    • Create a Storage Account and a Batch Account
      • Sign in to the Azure Portal.
      • Navigate to "Create a resource", then search and create both:
        • Storage Account (where your Python scripts can be uploaded; a short upload sketch follows this list)
        • An Azure Batch Account (used to run your scripts on virtual machines)
    • Set up your ADF pipeline
      • Go to Azure Data Factory Studio
      • Create a new pipeline
      • In the Activities pane, expand "Batch Service", and drag the Custom Activity onto the pipeline canvas
    • Select the Azure Batch tab, and then select New.
    • Complete the New linked service form as follows:
      • Name: Enter a name for the linked service, such as AzureBatch1.
      • Access key: Enter the primary access key you copied from your Batch account.
      • Account name: Enter your Batch account name.
      • Batch URL: Enter the account endpoint you copied from your Batch account, such as https://batchdotnet.eastus.batch.azure.com
      • Pool name: Enter the name of a pool you have already created in your Batch account (the linked tutorial uses custom-activity-pool, created in Batch Explorer).
      • Storage account linked service name: Select New. On the next screen, enter a Name for the linked storage service, such as AzureBlobStorage1, select your Azure subscription and linked storage account, and then select Create.
    • Select the Settings tab, and enter the following settings:
    • Command: Enter cmd /C python <<Your file name>>.py (a sketch of what the script can assume at run time follows this list).
    • Resource linked service: Select the linked storage service you created, such as AzureBlobStorage1, and test the connection to make sure it's successful.
    • Folder path: Select the folder icon, and then select the input container and select OK. The files from this folder download from the container to the pool nodes before the Python script runs.
    • Once the pipeline is set up, validate and debug it to confirm the script runs end to end.
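
    As a minimal sketch of the upload step mentioned above: the script (and any files it needs) must land in the Storage container that the Custom Activity's Folder path points at. This assumes the azure-storage-blob package is installed (pip install azure-storage-blob); the connection string, container name, and file names below are placeholders for your own values.

        from azure.storage.blob import BlobServiceClient

        # Placeholders: copy the connection string from the storage account's
        # "Access keys" blade; the container is the one selected as Folder path.
        CONN_STR = "<your-storage-connection-string>"
        CONTAINER = "input"

        service = BlobServiceClient.from_connection_string(CONN_STR)
        container = service.get_container_client(CONTAINER)

        # Batch downloads everything in this folder to the pool node's task
        # working directory before the command runs.
        for local_file in ["main.py", "config.json"]:  # placeholder file names
            with open(local_file, "rb") as data:
                container.upload_blob(name=local_file, data=data, overwrite=True)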
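
    And here is a sketch of what the uploaded script itself can assume at run time (main.py and config.json are placeholder names): Batch stages the container's files into the task's working directory before running the command from the Settings tab, and it captures stdout/stderr in the task logs (stdout.txt / stderr.txt), which you can inspect from the Batch account or via the ADF activity output. A non-zero exit code marks the Batch task, and therefore the ADF activity, as failed.

        import json
        import sys
        from pathlib import Path

        def main() -> int:
            # The current directory is the Batch task working directory, where
            # the files from the input container were downloaded.
            workdir = Path.cwd()
            print(f"Files staged by Batch: {[p.name for p in workdir.iterdir()]}")

            # Example: read a config file uploaded alongside the script.
            config_path = workdir / "config.json"
            if config_path.exists():
                config = json.loads(config_path.read_text())
                print(f"Loaded config: {config}")

            # ... your processing / ETL logic here ...

            return 0  # non-zero would fail the Batch task and the ADF activity

        if __name__ == "__main__":
            sys.exit(main())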

    Please refer to this Microsoft documentation for more details:

    Tutorial: Run a Batch job through Azure Data Factory - Azure Batch | Microsoft Learn

    YouTube URL: How to create etl/python pipeline in azure data factory | Azure Data Factory Execute Python script

    Let me know if you require any additional information from our end. If this answers your query, please click "Upvote" and "Accept the answer", as it might be beneficial to other community members reading this thread.

     Thanks,

    Kalyani

    1 person found this answer helpful.