In this tutorial, you build a data pipeline to move a sample dataset into a Data Warehouse. The exercise gives you a quick demonstration of how to use the pipeline Copy activity and how to load data into a Data Warehouse.
Before you begin, make sure you've completed the prerequisites.
Navigate to Power BI.
Select the Power BI icon in the bottom left of the screen, then select Data factory to open the Data Factory homepage.
Navigate to your Microsoft Fabric workspace. If you created a new workspace in the Prerequisites section, use it.
Select Data pipeline and then input a pipeline name to create a new pipeline.
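The UI is the path this tutorial follows, but a pipeline item can also be created for automation through the Microsoft Fabric REST API's Create Item endpoint. The sketch below only builds the request; the workspace ID, pipeline name, and bearer token are placeholders, and the endpoint details should be verified against the current API reference.

```python
# Hypothetical sketch: building a Create Item request for a DataPipeline via
# the Microsoft Fabric REST API. IDs and the auth token are placeholders.
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_create_pipeline_request(workspace_id: str, name: str):
    """Return the URL and JSON body to create a DataPipeline item."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    body = {"displayName": name, "type": "DataPipeline"}
    return url, body

url, body = build_create_pipeline_request("<workspace-id>", "Copy Sample to Warehouse")
print(url)
print(json.dumps(body))
# To actually send the request, POST it with a bearer token, e.g.:
# requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
```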
In this section, you start building your pipeline by following the steps below to copy a sample dataset provided by the pipeline into the Data Warehouse.
After you select Copy data on the canvas, the Copy assistant opens to get you started.
Choose the COVID-19 Data Lake from the Sample data options for your data source, and then select Next.
In the Connect to data source section of the Copy data assistant, a preview of the Bing COVID-19 sample data is displayed. Select Next to move on to the data destination.
Select the Workspace tab and choose Data warehouse. Then select Next.
Select your Data Warehouse from the drop-down list, then select Next.
Configure and map your source data to the destination Data Warehouse table by entering Destination table name, then select Next one more time.
Configure other settings on the Settings page. In this tutorial, select Next directly, since you don't need to use staging or the COPY command.
Review your Copy activity settings from the previous steps and select OK to finish, or revisit the previous steps in the tool to edit your settings if needed.
The Copy activity is added to your new data pipeline canvas. All settings including advanced settings for the activity are available in the tabs below the pipeline canvas when the created Copy data activity is selected.
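Behind the canvas, the activity you configured is saved as part of the pipeline's JSON definition. The sketch below shows a much-simplified, illustrative shape only: the source and sink type names are placeholders, not real schema values, and the definition the Copy assistant actually generates contains many more properties.

```python
# Illustrative sketch of how a Copy activity might be represented inside a
# pipeline definition. "<sample-source>" and "<warehouse-sink>" are
# placeholders, not real type names.
import json

copy_activity = {
    "name": "Copy data1",  # the activity name shown on the canvas
    "type": "Copy",
    "typeProperties": {
        "source": {"type": "<sample-source>"},  # the COVID-19 sample dataset
        "sink": {"type": "<warehouse-sink>"},   # the destination warehouse table
    },
}
print(json.dumps(copy_activity, indent=2))
```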
Switch to the Home tab and select Run. A confirmation dialog is displayed. Then select Save and run to start the activity.
You can monitor the running process and check the results on the Output tab below the pipeline canvas. Select the run details button (with the glasses icon highlighted) to view the run details.
The run details show how much data was read and written and various other details about the run.
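The Run and monitoring steps above can also be driven programmatically through the Fabric job APIs. The sketch below only constructs the endpoints for starting an on-demand pipeline run and polling its status; the URL paths, the Pipeline job type, and all IDs are assumptions to check against the current API reference.

```python
# Hypothetical sketch of the REST endpoints behind "Run" and run monitoring.
# Endpoint paths and the jobType value are assumptions; IDs are placeholders.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    """URL to POST to in order to start an on-demand pipeline run."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}/items/{pipeline_id}"
            "/jobs/instances?jobType=Pipeline")

def run_status_url(workspace_id: str, pipeline_id: str, instance_id: str) -> str:
    """URL to GET to poll a run; the response reports the run's status."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}/items/{pipeline_id}"
            f"/jobs/instances/{instance_id}")

print(run_pipeline_url("<workspace-id>", "<pipeline-id>"))
```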
You can also schedule the pipeline to run with a specific frequency as required. The following example schedules the pipeline to run every 15 minutes. You can also specify a Start time and End time for your schedule. If you don't specify a start time, it defaults to the time the schedule is applied. If you don't specify an end time, the pipeline run keeps recurring every 15 minutes.
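The recurrence semantics described above can be sketched in plain Python to make the start/end behavior concrete: runs fire at the start time and then every interval until the optional end time.

```python
# Sketch of the 15-minute recurrence described above. With no start time, the
# schedule effectively starts when it is applied; with no end time, runs recur
# indefinitely (modeled here with a finite window for illustration).
from datetime import datetime, timedelta

def occurrences(start: datetime, end: datetime, minutes: int = 15):
    """Yield scheduled run times from start (inclusive) up to end (exclusive)."""
    t = start
    while t < end:
        yield t
        t += timedelta(minutes=minutes)

runs = list(occurrences(datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 10, 0)))
print([r.strftime("%H:%M") for r in runs])  # → ['09:00', '09:15', '09:30', '09:45']
```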
This tutorial showed you how to load sample data into a Data Warehouse using Data Factory in Microsoft Fabric, from creating a pipeline with the Copy assistant through running, monitoring, and scheduling it.
Next, advance to learn more about monitoring your pipeline runs.