The following steps show how to create a notebook file in Azure Data Studio:
In Azure Data Studio, connect to your Azure Data Explorer cluster.
In the Connections pane, under Servers, right-click the Kusto database and select New Notebook. Alternatively, go to File > New Notebook.
Select Kusto for the Kernel. Confirm that the Attach to menu is set to the cluster name and database. For this article, we use the help.kusto.windows.net cluster with the Samples database.
You can save the notebook using the Save or Save as... command from the File menu.
To open a notebook, you can use the Open file... command in the File menu, select Open file on the Welcome page, or use the File: Open command from the command palette.
Change the connection
To change the Kusto connection for a notebook:
Select the Attach to menu from the notebook toolbar and then select Change Connection.
Note
Ensure that the database value is populated. Kusto notebooks require the database to be specified.
Now you can either select a recent connection or enter new connection details to connect.
Note
Specify the cluster name without the https://.
Run a code cell
You can create cells containing KQL queries and run them in place by selecting the Run cell button to the left of the cell. The results are shown in the notebook after the cell runs.
For example:
Add a new code cell by selecting the +Code command in the toolbar.
Copy and paste the following example into the cell and select Run cell. This example queries the StormEvents data for a specific event type.
Kusto
StormEvents
| where EventType == "Waterspout"
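You can refine the results by adding more operators to the query. For example, the following variation, assuming the standard StormEvents schema in the Samples database, counts waterspout events by state:

Kusto
StormEvents
| where EventType == "Waterspout"
| summarize EventCount = count() by State
| sort by EventCount desc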
Save the result or show chart
If you run a query that returns a result, you can save that result in different formats using the toolbar displayed above the result.
Save As CSV
Save As Excel
Save As JSON
Save As XML
Show Chart
Kusto
StormEvents
| limit 10
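Show Chart works best on aggregated results. As a sketch, assuming the Samples database's StormEvents table, a query such as the following returns one row per state, which charts well:

Kusto
StormEvents
| summarize EventCount = count() by State
| top 10 by EventCount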
Provide feedback
You can file a feature request or report a bug to provide feedback to the product team.