1. What is a data pipeline?

   - A special folder in OneLake storage where data can be exported from a lakehouse.
   - A sequence of activities to orchestrate a data ingestion or transformation process.
   - A saved Power Query.
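
For context, a pipeline is defined as an ordered set of activities with dependencies between them. The sketch below models that structure as a Python dictionary; the pipeline name, activity names, and the Notebook type label are illustrative assumptions rather than a verbatim Fabric definition, though the activities/dependsOn layout mirrors the standard pipeline definition format.

```python
import json

# A minimal sketch of a pipeline as a sequence of activities.
# Names and the "Notebook" type label are illustrative assumptions;
# the activities/dependsOn structure mirrors the standard format.
pipeline_definition = {
    "name": "IngestSalesData",  # hypothetical pipeline name
    "properties": {
        "activities": [
            {
                "name": "CopyRawFiles",
                "type": "Copy",  # copy activity ingests the source data
            },
            {
                # runs only after the copy activity succeeds
                "name": "TransformWithNotebook",
                "type": "Notebook",
                "dependsOn": [
                    {
                        "activity": "CopyRawFiles",
                        "dependencyConditions": ["Succeeded"],
                    }
                ],
            },
        ]
    },
}

print(json.dumps(pipeline_definition, indent=2))
```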
2. You want to use a pipeline to copy data to a folder, with the folder name specified each time the pipeline runs. What should you do?

   - Create multiple pipelines, one for each folder name.
   - Use a Dataflow (Gen2).
   - Add a parameter to the pipeline and use it to specify the folder name for each run.
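
To illustrate the parameter approach: a pipeline can declare a parameter and reference it with the pipeline expression language (`@pipeline().parameters.<name>`) wherever the folder path is set, so each run can supply a different value. The sketch below is a hedged Python model of that idea; the parameter name `folderName` and the sink property layout are assumptions, while the expression syntax is the standard one used by pipelines.

```python
# Sketch: a pipeline parameter supplies the output folder name per run.
# "folderName" and the sink property layout are illustrative assumptions;
# @pipeline().parameters.folderName is the standard expression syntax.
parameterized_pipeline = {
    "name": "CopyToNamedFolder",  # hypothetical pipeline name
    "properties": {
        "parameters": {
            # callers override this default when triggering a run
            "folderName": {"type": "string", "defaultValue": "run-001"}
        },
        "activities": [
            {
                "name": "CopyData",
                "type": "Copy",
                "typeProperties": {
                    "sink": {
                        # dynamic content: resolved at run time from the parameter
                        "folderPath": "@pipeline().parameters.folderName"
                    }
                },
            }
        ],
    },
}

# Each run passes its own value, e.g. {"folderName": "2024-06-01"},
# so one pipeline writes to a differently named folder per run.
```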
3. You have previously run a pipeline containing multiple activities. What's the best way to check how long each individual activity took to complete?

   - Rerun the pipeline and observe the output, timing each activity.
   - View the run details in the run history.
   - View the Refreshed value for your lakehouse's default dataset.