Import DAGs by using Azure Blob Storage

Note

This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.

This article provides step-by-step instructions for importing directed acyclic graphs (DAGs) into Workflow Orchestration Manager by using Azure Blob Storage.

Prerequisites

Blob Storage behind virtual networks isn't supported during the preview. Azure Key Vault configuration in storageLinkedServices isn't supported for importing DAGs.

Import DAGs

  1. Copy either the Sample Apache Airflow v2.x DAG or the Sample Apache Airflow v1.10 DAG, depending on the Airflow environment that you set up. Paste the content into a new file named tutorial.py.

    Upload the tutorial.py file to Blob Storage. For more information, see Upload a file into a blob.

    Note

    To import DAGs into the Airflow environment, you need to select a directory path from a Blob Storage account that contains folders named dags and plugins. Plugins aren't mandatory. Alternatively, you can have a container named dags and upload all Airflow files within it.

  2. Under the Manage hub, select Apache Airflow. Then hover over the previously created Airflow environment and select Import files to import all DAGs and dependencies into the Airflow environment.

    Screenshot that shows importing files in the Manage hub.

  3. Create a new linked service to an accessible storage account. (See the "Prerequisites" section for limitations.) You can also use an existing linked service if you already have your own DAGs.

    Screenshot that shows how to create a new linked service.

  4. Use the storage account where you uploaded the DAG. (Check the "Prerequisites" section.) Test the connection and then select Create.

    Screenshot that shows some linked service details.

  5. Browse to and select airflow if you're using the sample SAS URL. Otherwise, select the folder that contains the dags folder with your DAG files.

    Note

    You can import DAGs and their dependencies through this interface. You need to select a directory path from a Blob Storage account that contains folders named dags and plugins to import those into the Airflow environment. Plugins aren't mandatory.

    Screenshot that shows the Browse storage button on the Import Files screen.

    Screenshot that shows the airflow root folder on Browse.

  6. Select Import to import files.

    Screenshot that shows the Import button on the Import Files screen.

    Screenshot that shows importing DAGs.

Importing DAGs can take a couple of minutes during the preview. Use the notification center (the bell icon in the Data Factory UI) to track import status updates.
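Because the import step looks for folders named dags and (optionally) plugins, it can help to sanity-check a local staging directory before uploading it to Blob Storage. The helper below is a hypothetical convenience using only the standard library, not part of the product:

```python
from pathlib import Path


def check_airflow_layout(root: str) -> list[str]:
    """Return a list of layout problems; an empty list means the layout is OK.

    Expects <root>/dags (required) and <root>/plugins (optional),
    mirroring the structure the import step looks for in Blob Storage.
    """
    problems = []
    root_path = Path(root)

    dags = root_path / "dags"
    if not dags.is_dir():
        problems.append("missing required 'dags' folder")
    elif not any(dags.glob("*.py")):
        problems.append("'dags' folder contains no .py files")

    # 'plugins' is optional; only flag it if it exists but isn't a folder.
    plugins = root_path / "plugins"
    if plugins.exists() and not plugins.is_dir():
        problems.append("'plugins' exists but is not a folder")

    return problems
```

Run this against your staging folder, fix anything it reports, and then upload the folder with a tool such as Azure Storage Explorer or the Azure CLI.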