Note
Apache Airflow job is powered by Apache Airflow.
Apache Airflow is an open-source platform used to programmatically create, schedule, and monitor complex jobs. It allows you to define a set of tasks, called operators, that can be combined into directed acyclic graphs (DAGs) to represent data pipelines.
Apache Airflow Job provides a simple and efficient way to create and manage Apache Airflow environments, so you can run your orchestration jobs at scale. In this quickstart, you create a simple Apache Airflow Job to get familiar with the environment and functionality of Apache Airflow Job.
Note
Because Apache Airflow Job is in preview, it must be enabled by your tenant admin. If you already see Apache Airflow Job, your tenant admin may have enabled it already.
You can use an existing workspace or create a new workspace.
Expand the + New dropdown, click More options, and under the Data Factory section, select Apache Airflow Job (preview).
Give your project a suitable name and click the "Create" button.
Click the "New DAG file" card, give the file a name, and click the "Create" button.
A boilerplate DAG code is presented to you. You can edit the file as per your requirements.
Click the "Save" icon.
Begin by clicking on the "Run DAG" button.
Once initiated, a notification will promptly appear indicating the DAG is running.
To monitor the progress of the DAG run, simply click on "View Details" within the notification center. This action will redirect you to the Apache Airflow UI, where you can conveniently track the status and details of the DAG run.
The saved DAG files are loaded in the Apache Airflow UI. You can monitor them by clicking the "Monitor in Apache Airflow" button.
Training
Module
Automate workloads with Azure Databricks Jobs - Training
Automate workloads with Azure Databricks Jobs
Certification
Microsoft Certified: Azure Data Engineer Associate - Certifications
Demonstrate understanding of common data engineering tasks to implement and manage data engineering workloads on Microsoft Azure, using a number of Azure services.
Documentation
What is Apache Airflow job? - Microsoft Fabric
Learn about when to use Apache Airflow job, basic concepts, and supported regions.
Run a Fabric data pipeline and notebook using Apache Airflow DAG. - Microsoft Fabric
Learn to run Microsoft Fabric data pipelines and notebooks using Apache Airflow DAG.
Create your first data pipeline to copy data - Microsoft Fabric
Learn how to build and schedule a new data pipeline to copy sample data to a Lakehouse.