Access Apache Airflow Job Logs

Note

The Apache Airflow Job in Microsoft Fabric is powered by Apache Airflow.

This article shows you how to access Apache Airflow job logs through the Apache Airflow Job UI.

Prerequisites

To get started, you must have an Apache Airflow Job in your workspace. If you don't have one yet, see Quickstart: Create an Apache Airflow Job.

Create and Run the Apache Airflow Job

We use the default Directed Acyclic Graph (DAG), which is created automatically when a new file is added to the Dags folder of Fabric-managed storage.

  1. Create a new file in the Dags folder. A DAG is initialized that consists of a BashOperator printing "Hello world" to the Apache Airflow job logs; a minimal sketch of such a file appears after these steps.

    (Screenshot: creating a new Apache Airflow file.)

  2. Save the file, and then run the DAG in the Apache Airflow UI.

    (Screenshot: saving and running the Apache Airflow DAG.)
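
The exact contents of the generated file can vary; the following is a minimal sketch of a DAG like the one described above, assuming Apache Airflow 2.x. The file name, dag_id, and task_id are illustrative, not the defaults Fabric generates.

    # dags/hello_world.py -- illustrative name; place the file in the Dags folder.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="hello_world_dag",         # illustrative ID
        start_date=datetime(2024, 1, 1),
        schedule=None,                    # run only when triggered manually
        catchup=False,
    ) as dag:
        # BashOperator writes its stdout to the task log, so "Hello world"
        # appears in the Apache Airflow job logs after each run.
        print_hello = BashOperator(
            task_id="print_hello",
            bash_command='echo "Hello world"',
        )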

Access the Apache Airflow Job logs

  1. When the DAG runs, its logs are displayed in the results section. Click the arrow to expand the results section.

    (Screenshot: the results section in the Apache Airflow job.)

  2. Click the Cluster logs tab to access the logs.

    (Screenshot: the Cluster logs tab.)

  3. Apply filters to choose the Start time and Log type of the logs.

    (Screenshot: filters applied to the cluster logs.)

  4. Access the logs.

    (Screenshot: the cluster logs.)
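
If you author DAG files locally before uploading them to the Dags folder, the standard Apache Airflow CLI can also run a single task and print its log straight to the console, which is a quick way to check the "Hello world" output. The command below assumes a local Airflow installation and the illustrative IDs from the sketch above.

    airflow tasks test hello_world_dag print_hello 2024-01-01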

Related content

Quickstart: Create an Apache Airflow Job