Petabyte-scale ingestion with Azure Data Factory or Azure Synapse Pipeline
In this module, you will learn the methods available for ingesting data between data stores using Azure Data Factory.
Learning objectives
- Introduction
- List the data factory ingestion methods
- Describe data factory connectors
- Exercise: Use the data factory copy activity
- Exercise: Manage the self-hosted integration runtime
- Exercise: Set up the Azure integration runtime
- Understand data ingestion security considerations
- Knowledge check
- Summary
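The copy activity covered in the exercises above is defined as an activity inside a pipeline. As a rough sketch, a minimal copy activity that moves delimited text from Blob storage to Azure SQL Database might look like the following JSON (the dataset names `SourceBlobDataset` and `SinkSqlDataset` are hypothetical placeholders for datasets you would define separately):

```json
{
  "name": "CopyBlobToSql",
  "type": "Copy",
  "inputs": [
    { "referenceName": "SourceBlobDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "SinkSqlDataset", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "SqlSink" }
  }
}
```

The source and sink types vary by connector; the exercises in this module walk through authoring this kind of activity in the Data Factory UI rather than by hand.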
Prerequisites
Before starting this module, you should be able to:
- Sign in to the Azure portal
- Explain and create resource groups
- Describe Azure Data Factory and its core components