Ingest data into a Databricks lakehouse
Azure Databricks offers a variety of ways to ingest data from many sources into a lakehouse backed by Delta Lake. This article lists the supported data sources and links to steps for ingesting data from each source type.
Cloud object storage
To learn about how to configure incremental ingestion from cloud object storage, see Ingest data from cloud object storage.
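Incremental ingestion from cloud object storage is typically done with Auto Loader. The snippet below is a minimal sketch: the storage path, checkpoint locations, and table name are hypothetical placeholders, and the file format and options should be adjusted for your data.

```python
# Minimal Auto Loader sketch: incrementally ingest new JSON files from
# cloud object storage into a Delta table. All paths and the table name
# are hypothetical placeholders.
(spark.readStream
    .format("cloudFiles")                      # Auto Loader source
    .option("cloudFiles.format", "json")       # format of the incoming files
    .option("cloudFiles.schemaLocation", "/Volumes/main/default/checkpoints/schema")
    .load("abfss://landing@mystorage.dfs.core.windows.net/events/")
    .writeStream
    .option("checkpointLocation", "/Volumes/main/default/checkpoints/events")
    .trigger(availableNow=True)                # process all pending files, then stop
    .toTable("main.default.events_bronze"))
```

Auto Loader tracks which files it has already processed, so rerunning the stream picks up only newly arrived files.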
LakeFlow Connect
Databricks LakeFlow Connect offers native connectors for ingestion from enterprise applications and databases. The resulting ingestion pipeline is governed by Unity Catalog and is powered by serverless compute and Delta Live Tables.
LakeFlow Connect uses efficient incremental reads and writes to make data ingestion faster, more scalable, and more cost-efficient, while keeping your data fresh for downstream consumption.
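Connectors are typically configured in the UI, but ingestion pipelines can also be created programmatically. The sketch below uses the Databricks SDK for Python; the connection name, source object, and destination are hypothetical, and the exact classes and fields vary by connector and SDK release, so treat this as an illustration rather than a reference.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import pipelines

w = WorkspaceClient()

# Sketch of a LakeFlow Connect ingestion pipeline that syncs one source
# table through an existing Unity Catalog connection. All names here are
# hypothetical placeholders.
pipeline = w.pipelines.create(
    name="example_ingestion_pipeline",
    ingestion_definition=pipelines.IngestionPipelineDefinition(
        connection_name="my_connection",
        objects=[
            pipelines.IngestionConfig(
                table=pipelines.TableSpec(
                    source_schema="objects",
                    source_table="Account",
                    destination_catalog="main",
                    destination_schema="ingest",
                )
            )
        ],
    ),
)
print(f"Created pipeline {pipeline.pipeline_id}")
```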
Streaming sources
Azure Databricks can integrate with stream messaging services for near-real-time data ingestion into a lakehouse. See Streaming and incremental ingestion.
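For example, Structured Streaming can read from a Kafka-compatible broker (such as Azure Event Hubs through its Kafka endpoint) and write the stream into a Delta table. A minimal sketch, assuming a hypothetical broker address, topic, and checkpoint path:

```python
# Minimal sketch: stream messages from a Kafka topic into a Delta table.
# The broker, topic, and paths are hypothetical; an Event Hubs Kafka
# endpoint additionally requires SASL authentication options not shown here.
(spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
    .option("subscribe", "events")
    .option("startingOffsets", "earliest")
    .load()
    .selectExpr("CAST(key AS STRING) AS key",
                "CAST(value AS STRING) AS value",
                "timestamp")
    .writeStream
    .option("checkpointLocation", "/Volumes/main/default/checkpoints/kafka_events")
    .toTable("main.default.kafka_events_bronze"))
```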
Local data files
You can securely upload local data files or download files from a public URL. See Upload files to Azure Databricks.
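Once uploaded, for example to a Unity Catalog volume, the file can be read like any other data source. A minimal sketch, assuming a hypothetical volume path and table name:

```python
# Read a CSV file uploaded to a Unity Catalog volume and save it as a
# Delta table. The volume path and table name are hypothetical.
df = (spark.read
    .option("header", "true")        # first row contains column names
    .option("inferSchema", "true")   # derive column types from the data
    .csv("/Volumes/main/default/uploads/sales.csv"))

df.write.mode("overwrite").saveAsTable("main.default.sales")
```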
Migrate data to Delta Lake
To learn how to migrate existing data to Delta Lake, see Migrate data to Delta Lake.
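For example, an existing directory of Parquet files can be converted in place with CONVERT TO DELTA. A minimal sketch, assuming a hypothetical storage path:

```python
# Convert an existing directory of Parquet files to Delta Lake in place.
# The path is a hypothetical placeholder; partitioned data also needs a
# PARTITIONED BY clause listing the partition columns.
spark.sql(
    "CONVERT TO DELTA parquet.`abfss://data@mystorage.dfs.core.windows.net/legacy/events`"
)
```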