Tutorial: Analyze data with a notebook

Applies to: ✅ SQL analytics endpoint and Warehouse in Microsoft Fabric

In this tutorial, learn how to analyze data using a T-SQL notebook, or using a notebook with a lakehouse shortcut.

Option 1: Create a T-SQL notebook on the warehouse

To get started, create a T-SQL notebook in one of the following two ways:

  1. Create a T-SQL notebook from the Microsoft Fabric Warehouse homepage. Navigate to the Data Warehouse workload, and choose Notebook.

  2. Select + Warehouses and add the WideWorldImporters warehouse. Select the WideWorldImporters warehouse from the OneLake data hub dialog box.

    Screenshot from the Fabric portal of the Add Warehouses button underneath Warehouses in the All sources area of the Explorer.

  3. Create a T-SQL notebook from the warehouse editor. From your WideWorldImporters warehouse, from the top navigation ribbon, select New SQL query and then New SQL query in notebook.

    Screenshot from the Fabric portal of the New SQL query in notebook menu option.

  4. Once the notebook is created, you can see that the WideWorldImporters warehouse is loaded into the explorer, and that the ribbon shows T-SQL as the default language.

  5. Right-click the dimension_city table to open the More menu. Select SELECT TOP 100 to generate a quick SQL template that queries the first 100 rows of the table.

    Screenshot from the Fabric portal of the SELECT TOP 100 rows option.

  6. Run the code cell to see the messages and results.

    Screenshot from the Fabric portal of the SELECT TOP 100 results.
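The generated template resembles the following T-SQL. The query Fabric actually generates lists each column of dimension_city explicitly; this sketch uses `*` for brevity:

```sql
-- Generated-style template: preview the first 100 rows of dimension_city.
-- The actual generated query enumerates the columns; * is used here for brevity.
SELECT TOP (100) *
FROM [WideWorldImporters].[dbo].[dimension_city];
```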

Option 2: Create a lakehouse shortcut and analyze data with a notebook

First, we create a new lakehouse. To create a new lakehouse in your Microsoft Fabric workspace:

  1. Select the Data Warehouse Tutorial workspace in the navigation menu.

  2. Select + New > Lakehouse.

    Screenshot from the Fabric portal showing the + New menu. Lakehouse is boxed in red.

  3. In the Name field, enter ShortcutExercise, and select Create.

  4. The new lakehouse loads and the Explorer view opens, showing the Get data in your lakehouse menu. Under Load data in your lakehouse, select the New shortcut button.

    Screenshot from the Fabric portal showing the Load data in your lakehouse menu on the landing page. The New shortcut button is boxed in red.

  5. In the New shortcut window, select the button for Microsoft OneLake.

    Screenshot from the Fabric portal showing the New shortcut window. The button for Microsoft OneLake is boxed in red.

  6. In the Select a data source type window, scroll through the list until you find the Warehouse named WideWorldImporters you created previously. Select it, then select Next.

  7. In the OneLake object browser, expand Tables, expand the dbo schema, and then select the checkbox for dimension_customer. Select Next. Select Create.

  8. If you see a folder called Unidentified under Tables, select the Refresh icon in the horizontal menu bar.

    Screenshot from the Fabric portal showing the refresh button on the horizontal menu bar, and the Unidentified tables under ShortcutExercise in the Lakehouse explorer.

  9. Select dimension_customer in the Tables list to preview the data. The lakehouse shows the data from the dimension_customer table in the warehouse!

    Screenshot from the Fabric portal showing the data preview of the dimension_customer table.

  10. Next, create a new notebook to query the dimension_customer table. In the Home ribbon, select the dropdown list for Open notebook and choose New notebook.

  11. In the Explorer, select the Lakehouses source folder.

  12. Select, then drag dimension_customer from the Tables list into the open notebook cell. You can see that a PySpark query has been written for you to query all the data from ShortcutExercise.dimension_customer. This notebook experience is similar to the Jupyter notebook experience in Visual Studio Code. You can also open the notebook in VS Code.

    Screenshot from the Fabric portal notebook view. An arrow indicates the path to select dimension_customer, then drag and drop it into the open notebook cell.

  13. In the Home ribbon, select the Run all button. Once the query completes, you can see how easily PySpark can be used to query the Warehouse tables.

    Screenshot from the Fabric portal showing the results of running the notebook to display data from dimension_customer.
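The auto-generated cell looks roughly like the following PySpark sketch. The exact code Fabric writes may differ; this assumes the notebook is attached to the ShortcutExercise lakehouse so the shortcut table resolves as a Spark table, and it relies on the built-in `spark` session and `display` helper available in Fabric notebooks (it isn't runnable outside that environment):

```python
# Illustrative sketch of the generated PySpark query.
# Assumes ShortcutExercise is the notebook's default lakehouse,
# so dimension_customer (the warehouse shortcut) resolves as a Spark table.
df = spark.sql("SELECT * FROM ShortcutExercise.dimension_customer LIMIT 1000")
display(df)
```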

Next step