Exercise - Implement Lakeflow Jobs with Azure Databricks


Now it's your chance to implement Lakeflow Jobs with Azure Databricks. In this lab, you configure and automate a CDR data pipeline for TelConnect using Lakeflow Jobs. You run a prebuilt parameterized notebook that processes Call Detail Records through the bronze, silver, and gold layers. You then configure a Lakeflow Job in the Azure Databricks UI with task dependencies, a job parameter, scheduled and event-based triggers, failure notifications, and retry policies.
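As a preview of what you configure in the UI, the same job settings (tasks, dependencies, a job parameter, a schedule, failure notifications, and retries) can also be expressed declaratively. The sketch below is illustrative only and uses hypothetical names (the notebook path, task keys, the `env` parameter, and the email address are assumptions, not part of the lab):

```yaml
# Illustrative sketch of a Lakeflow Job definition in Databricks
# Asset Bundle style. Names and paths are hypothetical; the lab
# itself configures these settings through the Azure Databricks UI.
resources:
  jobs:
    cdr_pipeline_job:
      name: TelConnect CDR Pipeline
      # Job-level parameter, available to all tasks
      parameters:
        - name: env
          default: dev
      tasks:
        - task_key: bronze_ingest
          notebook_task:
            notebook_path: /Workspace/telconnect/cdr_pipeline  # hypothetical path
          max_retries: 2                      # retry policy on failure
          min_retry_interval_millis: 60000
        - task_key: silver_transform
          depends_on:
            - task_key: bronze_ingest         # task dependency
          notebook_task:
            notebook_path: /Workspace/telconnect/cdr_pipeline
        - task_key: gold_aggregate
          depends_on:
            - task_key: silver_transform
          notebook_task:
            notebook_path: /Workspace/telconnect/cdr_pipeline
      # Time-based trigger; an event-based (file arrival) trigger
      # could be used instead of a cron schedule
      schedule:
        quartz_cron_expression: "0 0 2 * * ?"  # daily at 02:00
        timezone_id: UTC
      email_notifications:
        on_failure:
          - ops@example.com                    # hypothetical address
```

In the lab you achieve the equivalent result interactively, but seeing the settings side by side can help you map each UI step to a job property.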

Note

To complete this lab, you need an Azure subscription in which you have administrative access.

Launch the exercise and follow the instructions.
