Tutorial: Run Hello-world DAG in Apache Airflow Job

Note

Apache Airflow job is powered by Apache Airflow.

In this tutorial, you run a hello-world DAG in an Apache Airflow Job. The tutorial focuses on familiarizing you with the features and environment of the Apache Airflow Job.

Create an Apache Airflow Job

  1. Use an existing workspace, or create a new workspace.

  2. Expand the + New dropdown, select More options, and then, under the Data Factory section, select Apache Airflow Job (preview).

    Screenshot showing the More options selection.

    Screenshot showing the Apache Airflow Job selection.

  3. Give your project a suitable name and select the Create button.

Create a DAG File

  1. Select the New DAG file card, give the file a name, and select the Create button.

    Screenshot to name the DAG file.

  2. Boilerplate DAG code is presented to you. You can edit the file to suit your requirements.

    Screenshot presents boilerplate DAG file in Microsoft Fabric.

  3. Select the Save icon.

    Screenshot presents how to save DAG file in Microsoft Fabric.

Monitor your Apache Airflow DAG in Apache Airflow UI

  1. The saved DAG files are loaded into the Apache Airflow UI. You can monitor them by selecting the Monitor in Apache Airflow button.

    Screenshot to monitor the Airflow DAG.

    Screenshot presents the loaded Airflow DAG.

Related content

Quickstart: Create an Apache Airflow Job