Use Data Factory IR for Custom Notebook

Krishnamohan Nadimpalli 396 Reputation points


I have an Azure Functions Python app that is called from Azure Data Factory, but it always fails with a 500 error. I made changes in VS Code and redeployed, but it still fails. I am unable to get this Azure Function app up and running.

I have decided to convert the complete Azure Function App Python code into a notebook. The notebook runs perfectly when I execute it locally.

So I want to run this notebook in Azure Data Factory, which means I need Data Factory compute or an integration runtime (IR) to run it. How do I achieve this? What options do I have?

I explored the option of Azure Batch accounts, but I am unable to create one because of permission restrictions at the org level.

Please let me know my options for running a local notebook on cloud infrastructure like ADF.

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

Sort by: Most helpful
  1. BhargavaGunnam-MSFT 21,356 Reputation points Microsoft Employee

    Hello Krishnamohan Nadimpalli ,

    Welcome to the MS Q&A platform.

    One option I can think of is using an Azure Databricks or Synapse notebook.

    Using Databricks:

    You will need to create a Databricks workspace and cluster, then import your notebook into the workspace.

    Install the required Python packages on your cluster. Once the packages are installed, you can execute your notebook from an ADF pipeline using the Databricks Notebook activity.
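    As a rough sketch, the Databricks Notebook activity in an ADF pipeline definition looks something like the JSON below. The linked service name, notebook path, and parameter are placeholders you would replace with your own values; check the exact property names against the ADF documentation:

    ```json
    {
      "name": "RunDatabricksNotebook",
      "type": "DatabricksNotebook",
      "linkedServiceName": {
        "referenceName": "AzureDatabricksLinkedService",
        "type": "LinkedServiceReference"
      },
      "typeProperties": {
        "notebookPath": "/Users/me@example.com/my-notebook",
        "baseParameters": {
          "run_date": "@pipeline().parameters.runDate"
        }
      }
    }
    ```

    The linked service holds the Databricks workspace URL and access token, so the activity itself only needs the notebook path and any parameters.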

    Using Synapse:

    Create an Azure Synapse workspace and an Apache Spark pool (Synapse notebooks run on Spark pools, not SQL pools).

    Launch Synapse Studio and create a new notebook, or import your existing Python notebook into Synapse Studio.

    Install the necessary Python packages on your Spark pool.

    Once the required packages are installed, you can execute the notebook via the Notebook activity in a Synapse pipeline.
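    For comparison, a Notebook activity in a Synapse pipeline definition looks roughly like this; the notebook and Spark pool names are placeholders, and the exact schema should be verified against the Synapse documentation:

    ```json
    {
      "name": "RunSynapseNotebook",
      "type": "SynapseNotebook",
      "typeProperties": {
        "notebook": {
          "referenceName": "my-notebook",
          "type": "NotebookReference"
        },
        "sparkPool": {
          "referenceName": "mysparkpool",
          "type": "BigDataPoolReference"
        }
      }
    }
    ```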


    I hope this helps. Please let me know if you have any further questions.
