Manage data with Delta Lake
Delta Lake is a data management solution in Azure Databricks that provides ACID transactions, schema enforcement, and time travel, giving you data consistency, integrity, and versioning capabilities.
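The following sketch gives a first look at these capabilities. It assumes an Azure Databricks notebook where a `spark` session is already available and Delta Lake is the table format; the path `/tmp/delta/events` and the column names are hypothetical, chosen only for illustration.

```python
# Sketch: ACID writes, schema enforcement, and time travel with a Delta table.
# Assumes a Databricks notebook where `spark` (a SparkSession) is predefined.
from pyspark.sql import Row

path = "/tmp/delta/events"  # hypothetical location for the demo table

# Write an initial batch as a Delta table (this becomes version 0).
df_v0 = spark.createDataFrame([Row(id=1, action="view"), Row(id=2, action="click")])
df_v0.write.format("delta").mode("overwrite").save(path)

# Append a second batch; each write is committed as an ACID transaction (version 1).
df_v1 = spark.createDataFrame([Row(id=3, action="purchase")])
df_v1.write.format("delta").mode("append").save(path)

# Schema enforcement: an append whose schema doesn't match the table's schema
# is rejected with an error instead of silently corrupting the data.
bad = spark.createDataFrame([Row(id="not-an-int", extra=True)])
try:
    bad.write.format("delta").mode("append").save(path)
except Exception as e:
    print(f"Write rejected by schema enforcement: {e}")

# Time travel: read the table as it existed at an earlier version.
original = spark.read.format("delta").option("versionAsOf", 0).load(path)
original.show()
```

You'll work through each of these features in more depth in the units that follow.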
Learning objectives
In this module, you learn:
- What Delta Lake is
- How to manage ACID transactions using Delta Lake
- How to use schema versioning and time travel in Delta Lake
- How to maintain data integrity with Delta Lake
Prerequisites
Before starting this module, you should know how to use Apache Spark in Azure Databricks. Consider completing the Use Apache Spark in Azure Databricks module before this one.