Python plugin
The Python plugin runs a user-defined function (UDF) using a Python script. The Python script gets tabular data as its input, and produces tabular output. The plugin's runtime is hosted in sandboxes, running on the cluster's nodes.
Syntax
T | evaluate [hint.distribution = (single | per_node)] python(output_schema, script [, script_parameters] [, external_artifacts] [, spill_to_disk])
Arguments
- output_schema: A `type` literal that defines the output schema of the tabular data returned by the Python code.
  - The format is: `typeof(ColumnName: ColumnType[, ...])`. For example, `typeof(col1:string, col2:long)`.
  - To extend the input schema, use the following syntax: `typeof(*, col1:string, col2:long)`
- script: A `string` literal that is a valid Python script to execute. To generate multi-line strings, see Usage tips.
- script_parameters: An optional `dynamic` literal. It's a property bag of name/value pairs to be passed to the Python script as the reserved `kargs` dictionary. For more information, see Reserved Python variables.
- hint.distribution: An optional hint for the plugin's execution to be distributed across multiple cluster nodes. The default value is `single` (a minimal example using this hint appears after this list).
  - `single`: A single instance of the script will run over the entire query data.
  - `per_node`: If the query before the Python block is distributed, an instance of the script will run on each node, on the data that it contains.
- external_artifacts: An optional `dynamic` literal that is a property bag of name and URL pairs, for artifacts that are accessible from cloud storage. For more information, see Using external artifacts.
- spill_to_disk: An optional `boolean` literal specifying an alternative method for serializing the input table to the Python sandbox. For serializing big tables, set it to `true` to speed up the serialization and significantly reduce the sandbox memory consumption. The default is `false`, as this parameter is experimental.
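For illustration, a minimal invocation that exercises the optional arguments might look like the following sketch. `MyTable`, the `val` column, and the `factor` parameter are placeholders, and passing `spill_to_disk` by name is assumed to work the same way as `external_artifacts` does in the examples later in this article:

MyTable
| evaluate hint.distribution = per_node python(
typeof(*, scaled:double),                 // extend the input schema with a new column
```
result = df
result["scaled"] = df["val"] * kargs["factor"]
```
, bag_pack('factor', 2.5)                 // becomes the kargs dictionary in the script
, spill_to_disk=true                      // serialize the input table via disk
)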
Reserved Python variables
The following variables are reserved for interaction between Kusto Query Language and the Python code.
- `df`: The input tabular data (the values of `T` above), as a `pandas` DataFrame.
- `kargs`: The value of the script_parameters argument, as a Python dictionary.
- `result`: A `pandas` DataFrame created by the Python script, whose value becomes the tabular data that gets sent to the Kusto query operator that follows the plugin.
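As a minimal sketch of how these variables interact inside the script body (the `value` column and the `threshold` parameter are placeholders):

result = df                                      # start from the input DataFrame
threshold = kargs["threshold"]                   # parameter passed via script_parameters
result = result[result["value"] > threshold]     # result becomes the plugin's tabular output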
Enable the plugin
- The plugin is disabled by default.
- To enable the plugin, see the list of prerequisites.
- Enable or disable the plugin in the Azure portal, in your cluster's Configuration tab.
Python sandbox image
The Python sandbox image is based on the Anaconda 5.2.0 distribution with the Python 3.6 engine. See the list of Anaconda packages.
Note
A small percentage of packages might be incompatible with the limitations enforced by the sandbox where the plugin is run.
The Python image also contains common ML packages: `tensorflow`, `keras`, `torch`, `hdbscan`, `xgboost`, and other useful packages.

The plugin imports `numpy` (as `np`) and `pandas` (as `pd`) by default. You can import other modules as needed.
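For example, a script body can use `np` and `pd` directly, while any other module is imported explicitly (a sketch; the `value` column is a placeholder):

import datetime                                   # other modules require an explicit import
result = df
result["clipped"] = np.clip(df["value"], 0, 100)  # np and pd are preloaded
result["ts"] = datetime.datetime.utcnow().isoformat()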
Use Ingestion from query and update policy
- Use the plugin in queries that are:
  - Defined as part of an update policy, whose source table is ingested to using non-streaming ingestion.
  - Run as part of a command that ingests from a query, such as `.set-or-append` (see the sketch after this list).

  In both these cases, verify that the volume and frequency of the ingestion, and the complexity and resources used by the Python logic, align with sandbox parameters and the cluster's available resources. Failure to do so may result in throttling errors.
- You can't use the plugin in a query that is defined as part of an update policy whose source table is ingested using streaming ingestion.
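As a sketch of the second case, a command that ingests the plugin's output into a table might look like this (`SourceTable`, `TargetTable`, the `value` column, and the scoring logic are placeholders; the target table's schema must match the plugin's output schema):

.set-or-append TargetTable <|
SourceTable
| evaluate python(
typeof(*, score:double),
```
result = df
result["score"] = df["value"] * 2.0
```
)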
Examples
range x from 1 to 360 step 1
| evaluate python(
//
typeof(*, fx:double), // Output schema: append a new fx column to original table
```
result = df
n = df.shape[0]
g = kargs["gain"]
f = kargs["cycles"]
result["fx"] = g * np.sin(df["x"]/n*2*np.pi*f)
```
, bag_pack('gain', 100, 'cycles', 4) // dictionary of parameters
)
| render linechart
print "This is an example for using 'external_artifacts'"
| evaluate python(
typeof(File:string, Size:string), ```if 1:
    import os
    result = pd.DataFrame(columns=['File','Size'])
    sizes = []
    path = '.\\Temp'
    files = os.listdir(path)
    result['File']=files
    for file in files:
        sizes.append(os.path.getsize(path + '\\' + file))
    result['Size'] = sizes
    ```,
external_artifacts =
dynamic({"this_is_my_first_file":"https://kustoscriptsamples.blob.core.windows.net/samples/R/sample_script.r",
"this_is_a_script":"https://kustoscriptsamples.blob.core.windows.net/samples/python/sample_script.py"})
)
| File | Size |
|---|---|
| this_is_a_script | 120 |
| this_is_my_first_file | 105 |
Performance tips
- Reduce the plugin's input data set to the minimum amount required (columns/rows).
- Use filters on the source data set, when possible, with Kusto's query language.
- To do a calculation on a subset of the source columns, project only those columns before invoking the plugin.
- Use `hint.distribution = per_node` whenever the logic in your script is distributable (see the sketch after these tips).
  - You can also use the partition operator for partitioning the input data set.
- Use Kusto's query language whenever possible, to implement the logic of your Python script.
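A sketch that combines these tips, with placeholder table and column names: filter and project in the query language first, then distribute the script per node:

MyTable
| where Timestamp > ago(1d)                  // filter in KQL before the plugin
| project Id, Value                          // pass only the columns the script needs
| evaluate hint.distribution = per_node python(
typeof(*, normalized:double),
```
result = df
result["normalized"] = (df["Value"] - df["Value"].mean()) / df["Value"].std()
```
)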
Usage tips
- To generate multi-line strings containing the Python script in `Kusto.Explorer`, copy your Python script from your favorite Python editor (Jupyter, Visual Studio Code, PyCharm, and so on). Then do one of the following:
  - Enclose the full script between lines containing three consecutive backticks, for example:

    ```
    python code
    ```
  - Press F2 to open the Edit in Python window. Paste the script into this window. Select OK. The script will be decorated with quotes and new lines, so it's valid in Kusto, and automatically pasted into the query tab.
  - Paste the Python code directly into the query tab. Select those lines, and press Ctrl+K, Ctrl+S hot keys, to decorate them as above. To reverse, press Ctrl+K, Ctrl+M hot keys. See the full list of Query Editor shortcuts.
- To avoid conflicts between Kusto string delimiters and Python string literals, use:
  - Single quote characters (`'`) for Kusto string literals in Kusto queries
  - Double quote characters (`"`) for Python string literals in Python scripts
- Use the `externaldata` operator to obtain the content of a script that you've stored in an external location, such as Azure Blob storage.
Example
let script =
externaldata(script:string)
[h'https://kustoscriptsamples.blob.core.windows.net/samples/python/sample_script.py']
with(format = raw);
range x from 1 to 360 step 1
| evaluate python(
typeof(*, fx:double),
toscalar(script),
bag_pack('gain', 100, 'cycles', 4))
| render linechart
Using External Artifacts
External artifacts from cloud storage can be made available for the script and used at runtime.
The URLs referenced by the external artifacts property must be:
- Included in the cluster's callout policy.
- In a publicly available location, or provide the necessary credentials, as explained in storage connection strings.
Note
When authenticating external artifacts using Managed Identities, the `SandboxArtifacts` usage must be defined on the cluster-level managed identity policy.
The artifacts are made available for the script to consume from a local temporary directory, `.\Temp`. The names provided in the property bag are used as the local file names. See Examples.
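For instance, if the property bag maps the name `my_data.csv` to a blob URL (both the name and the URL below are placeholders), the script can read that file from the temporary directory:

T
| evaluate python(
typeof(*, ref_rows:long),
```
ref = pd.read_csv(r'.\Temp\my_data.csv')    # local copy of the artifact, named per the property bag
result = df
result["ref_rows"] = ref.shape[0]
```,
external_artifacts=dynamic({"my_data.csv": "https://<storage-account>.blob.core.windows.net/<container>/my_data.csv"})
)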
For information regarding referencing external packages, see Install packages for the Python plugin.
Install packages for the Python plugin
You may need to install package(s) yourself, for the following reasons:
- The package is private and is your own.
- The package is public but isn't included in the plugin's base image.
Install packages as follows:
Prerequisites
1. Create a blob container to host the packages, preferably in the same place as your cluster. For example, `https://artifactswestus.blob.core.windows.net/python`, assuming your cluster is in West US.
2. Alter the cluster's callout policy to allow access to that location.
   - This change requires AllDatabasesAdmin permissions.
   - For example, to enable access to a blob located in `https://artifactswestus.blob.core.windows.net/python`, run the following command:
.alter-merge cluster policy callout @'[ { "CalloutType": "sandbox_artifacts", "CalloutUriRegex": "artifactswestus\\.blob\\.core\\.windows\\.net/python/","CanCall": true } ]'
Install packages
1. For public packages in PyPI or other channels, download the package and its dependencies.
   - From a cmd window in your local Windows Python environment, run:
     `pip wheel [-w download-dir] package-name`
2. Create a zip file that contains the required package and its dependencies.
   - For private packages, zip the folder of the package and the folders of its dependencies.
   - For public packages, zip the files that were downloaded in the previous step.
Note
- Make sure to download the package that is compatible with the Python engine and the platform of the sandbox runtime (currently 3.6.5 on Windows).
- Make sure to zip the `.whl` files themselves, and not their parent folder.
- You can skip `.whl` files for packages that already exist with the same version in the base sandbox image.
3. Upload the zipped file to a blob in the artifacts location (from step 1 of the prerequisites).
4. Call the `python` plugin.
   - Specify the `external_artifacts` parameter with a property bag of name and reference to the zip file (the blob's URL, including a SAS token).
   - In your inline Python code, import `Zipackage` from `sandbox_utils` and call its `install()` method with the name of the zip file.
Example
Install the Faker package that generates fake data.
range ID from 1 to 3 step 1
| extend Name=''
| evaluate python(typeof(*), ```if 1:
    from sandbox_utils import Zipackage
    Zipackage.install("Faker.zip")
    from faker import Faker
    fake = Faker()
    result = df
    for i in range(df.shape[0]):
        result.loc[i, "Name"] = fake.name()
    ```,
external_artifacts=bag_pack('Faker.zip', 'https://artifacts.blob.core.windows.net/kusto/Faker.zip?*** REPLACE WITH YOUR SAS TOKEN ***'))
| ID | Name |
|---|---|
| 1 | Gary Tapia |
| 2 | Emma Evans |
| 3 | Ashley Bowen |
This capability isn't supported in Azure Monitor