1. What is a data pipeline?

   - A special folder in OneLake storage where data can be exported from a lakehouse
   - A sequence of activities to orchestrate a data ingestion or transformation process
   - A saved Power Query
2. You want to use a pipeline to copy data to a folder with a specified name for each run. What should you do?

   - Create multiple pipelines, one for each folder name
   - Use a Dataflow (Gen2)
   - Add a parameter to the pipeline and use it to specify the folder name for each run
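The parameterized approach in the second question can be sketched in Data Factory-style pipeline JSON. The expression `@pipeline().parameters.folderName` is the standard way to reference a pipeline parameter at run time; the parameter name, activity name, and the sink/store type names below are illustrative assumptions, not exact Fabric definitions.

```json
{
  "name": "CopyToNamedFolder",
  "properties": {
    "parameters": {
      "folderName": { "type": "string", "defaultValue": "run-001" }
    },
    "activities": [
      {
        "name": "Copy data",
        "type": "Copy",
        "typeProperties": {
          "sink": {
            "type": "DelimitedTextSink",
            "storeSettings": {
              "folderPath": {
                "value": "@pipeline().parameters.folderName",
                "type": "Expression"
              }
            }
          }
        }
      }
    ]
  }
}
```

Because `folderName` is a pipeline parameter rather than a hard-coded value, each run (manual or scheduled) can supply a different folder name without cloning the pipeline.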
3. You have previously run a pipeline containing multiple activities. What's the best way to check how long each individual activity took to complete?

   - Rerun the pipeline and observe the output, timing each activity
   - View the run details in the run history
   - View the Refreshed value for your lakehouse's default dataset