This tutorial describes the steps to move data into a Lakehouse.
Two approaches are provided using the copy assistant:
To get started, you must complete the following prerequisites:
Follow these steps to set up your copy activity.
Open an existing data pipeline or create a new data pipeline.
Select Copy data assistant on the canvas to open the wizard and get started, or select Use copy assistant from the Copy data drop-down list on the Activities tab of the ribbon.
Choose your data source by selecting a data source type. This tutorial uses Azure SQL Database as an example. On the Choose data source screen, search for and select Azure SQL Database.
Create a connection to your data source by filling in the required connection information on the panel, and then select Next.
If you didn't select a database when creating the connection, a list of databases is presented for you to choose from.
Select the table(s) to be moved, and then select Next.
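If you want to confirm the connection details and see which tables are available before running the assistant, you can check from Python. This is a minimal sketch, assuming the pyodbc package is installed and using hypothetical server, database, and credential values that you would replace with your own:

```python
import pyodbc

# Hypothetical connection details -- substitute your own server, database, and credentials.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:myserver.database.windows.net,1433;"
    "DATABASE=mydatabase;"
    "UID=myuser;"
    "PWD=mypassword;"
    "Encrypt=yes;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()

# List the user tables available to copy.
cursor.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME "
    "FROM INFORMATION_SCHEMA.TABLES "
    "WHERE TABLE_TYPE = 'BASE TABLE'"
)
for schema, name in cursor.fetchall():
    print(f"{schema}.{name}")

conn.close()
```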
Choose Lakehouse as your destination and then select Next.
Enter a Lakehouse name, then select Create and connect.
Configure and map your source data to the destination Lakehouse table. Select Tables for the Root folder and Load to a new table for Load settings. Provide a Table name and select Next.
Review your configuration, and uncheck the Start data transfer immediately checkbox. Then select Next to finish the assistant experience.
Select Run from the Home toolbar and then select Save and run when prompted.
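If you prefer to trigger the run programmatically instead of from the toolbar, pipeline runs can also be started through the Fabric Job Scheduler REST API. The sketch below is an assumption-laden outline, not the tutorial's method: the workspace ID, pipeline item ID, and Microsoft Entra access token are all placeholders you would supply yourself.

```python
import requests

# Hypothetical identifiers -- replace with your workspace ID, pipeline item ID,
# and a valid Microsoft Entra access token (e.g., obtained via azure-identity).
workspace_id = "<your-workspace-id>"
pipeline_id = "<your-pipeline-item-id>"
access_token = "<your-entra-access-token>"

# On-demand job run via the Fabric Job Scheduler REST API.
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline"
)
response = requests.post(url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()

# A 202 Accepted response means the run was queued; the status URL is in the Location header.
print(response.status_code, response.headers.get("Location"))
```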
For each activity that ran, you can select the activity's corresponding link on the Output tab after the pipeline runs to view the details of that activity. In this case, two individual copy activities ran, one for each table copied from Azure SQL Database to the Lakehouse. When you select an activity's details link, you can see how much data was read and written, how much space the data consumed in the source and destination, and the throughput speed, among other details.
Go to your Lakehouse and refresh your Lake view to see the latest data ingested.
Switch to the Table view to view the data in the table.
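You can also inspect the ingested rows from a Fabric notebook attached to the Lakehouse, where the Spark session is predefined. A minimal sketch, assuming a hypothetical table name mytable matching the Table name you chose in the assistant:

```python
# In a Fabric notebook attached to the Lakehouse, `spark` is predefined.
# "mytable" is a hypothetical name -- use the Table name you chose in the assistant.
df = spark.read.table("mytable")

df.printSchema()   # confirm the mapped columns arrived as expected
print(df.count())  # row count should match the source table
df.show(10)        # preview the first few ingested rows
```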
Note
Currently, data lands in the Lakehouse Tables folder (a managed area) in Delta format only. Those files are automatically registered as tables and are visible under the Table view in the Lakehouse portal. Only first-level folders under Tables are registered as Delta tables. Browsing or previewing from the Lakehouse Table view isn't supported yet. Data loaded into the same table is appended; deleting or updating rows in tables isn't supported yet.
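Because repeated loads into the same table are appended, each copy run is recorded as a separate write in the table's Delta transaction log. A sketch of how you could confirm this from a Fabric notebook with Delta's DESCRIBE HISTORY command, again assuming the hypothetical table name mytable:

```python
# Each copy run that targets the same table appears as a separate write in the Delta log.
spark.sql("DESCRIBE HISTORY mytable").select(
    "version", "timestamp", "operation", "operationParameters"
).show(truncate=False)
```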
This sample showed you how to move data from an Azure SQL Database into a Lakehouse with the Copy assistant in Data Factory for Microsoft Fabric. You learned how to create a data pipeline, configure the copy assistant with an Azure SQL Database source and a Lakehouse destination, run the pipeline, and view the ingested data.
Next, advance to learn more about monitoring your pipeline runs.