Ingest data with Spark and Microsoft Fabric notebooks

Level: Intermediate
Roles: Data Analyst, Data Engineer, Data Scientist
Product: Microsoft Fabric

Discover how to use Apache Spark and Python to ingest data into a Microsoft Fabric lakehouse. Fabric notebooks provide a scalable and systematic environment for running that ingestion code.

Learning objectives

By the end of this module, you’ll be able to:

  • Ingest external data into a Fabric lakehouse using Spark

  • Configure authentication and optimization options for external data sources

  • Load data into the lakehouse as files or as Delta tables (see the sketch after this list)
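
As a preview of the pattern this module teaches, the sketch below assumes a Fabric notebook attached to a default lakehouse and a hypothetical Azure Data Lake Storage Gen2 account (the storage account name, container, path, and key are placeholders, not real values). It authenticates to the external source, reads CSV data with Spark, and then lands the data both as files and as a Delta table.

    # Runs in a Fabric notebook with a default lakehouse attached; the "spark"
    # session is provided by the notebook. All storage names below are placeholders.

    # Configure authentication for the external source (here: ADLS Gen2 account key).
    spark.conf.set(
        "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
        "<storage-account-key>"
    )

    # Ingest external data with Spark.
    external_path = "abfss://mycontainer@mystorageacct.dfs.core.windows.net/sales/*.csv"
    df = spark.read.option("header", "true").csv(external_path)

    # Load into the lakehouse as files (Parquet under the Files area)...
    df.write.mode("overwrite").parquet("Files/raw/sales")

    # ...or as a managed Delta table (appears under the Tables area).
    df.write.format("delta").mode("overwrite").saveAsTable("sales_raw")

Relative paths such as "Files/..." and "Tables/..." resolve against the notebook's default lakehouse, which is why no explicit lakehouse URI is needed for the writes.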

Prerequisites

  • Experience with Apache Spark and Python

  • Basic understanding of extracting, transforming, and loading data