Work with models in Azure Machine Learning
APPLIES TO: Azure CLI ml extension v2 (current), Python SDK azure-ai-ml v2 (current)
Azure Machine Learning lets you work with different types of models, such as custom, MLflow, and Triton. In this article, you learn how to register a model from different locations, and how to use the Azure Machine Learning SDK, the user interface (UI), and the Azure Machine Learning CLI to manage your models.
Tip
If you have model assets created with the SDK/CLI v1, you can still use them with SDK/CLI v2. Full backward compatibility is provided. All models registered with the v1 SDK are assigned the type `custom`.
Prerequisites
- An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the free or paid version of Azure Machine Learning.
- An Azure Machine Learning workspace.
- The Azure Machine Learning SDK v2 for Python.
- The Azure Machine Learning CLI v2. To use it, install the Azure CLI and the `ml` extension. For more information, see Install, set up, and use the CLI (v2).
Supported paths
When you register a model, you need to specify a `path` parameter that points to the data or job location. The following table shows the data locations supported in Azure Machine Learning, with examples for the `path` parameter:
| Location | Examples |
|---|---|
| A path on your local computer | `mlflow-model/model.pkl` |
| A path on an Azure Machine Learning datastore | `azureml://datastores/<datastore-name>/paths/<path_on_datastore>` |
| A path from an Azure Machine Learning job | `azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>` |
| A path from an MLflow job | `runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>` |
| A path from a model asset in an Azure Machine Learning workspace | `azureml:<model-name>:<version>` |
| A path from a model asset in an Azure Machine Learning registry | `azureml://registries/<registry-name>/models/<model-name>/versions/<version>` |
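For example, a model logged by an MLflow run can be registered directly from its run path with the Python SDK. The following is a minimal sketch; it assumes an authenticated `ml_client` (see Connect to your workspace later in this article) and a real run ID in place of the placeholder:

```python
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes

# Register a model straight from an MLflow run, using the "runs:/" path format
# from the table above. <run-id> is a placeholder for an actual run ID.
run_model = Model(
    path="runs:/<run-id>/model",
    type=AssetTypes.MLFLOW_MODEL,
    name="run-model-example",
    description="Model registered from an MLflow run.",
)
ml_client.models.create_or_update(run_model)
```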
Supported modes
When you run a job with model inputs or outputs, you can specify the *mode*: for example, whether the model should be read-only mounted or downloaded to the compute target. The following table shows the possible modes for the different type and input/output combinations:
| Type | Input/Output | `upload` | `download` | `ro_mount` | `rw_mount` | `direct` |
|---|---|---|---|---|---|---|
| `custom` file | Input | | ✓ | ✓ | | ✓ |
| `custom` folder | Input | | ✓ | ✓ | | ✓ |
| `mlflow` | Input | | ✓ | ✓ | | |
| `custom` file | Output | ✓ | | | ✓ | ✓ |
| `custom` folder | Output | ✓ | | | ✓ | ✓ |
| `mlflow` | Output | ✓ | | | ✓ | ✓ |
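In the Python SDK, the mode is set on the job's `Input` or `Output` object. A minimal sketch, assuming a model named `run-model-example` is already registered in the workspace:

```python
from azure.ai.ml import Input
from azure.ai.ml.constants import AssetTypes, InputOutputModes

# Request that the registered MLflow model be downloaded to the compute target
# rather than mounted read-only. This Input would then be passed to a job's inputs.
model_input = Input(
    type=AssetTypes.MLFLOW_MODEL,
    path="azureml:run-model-example:1",
    mode=InputOutputModes.DOWNLOAD,
)
```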
Follow along in Jupyter Notebooks
You can follow along with this sample in a Jupyter Notebook. In the azureml-examples repository, open the notebook: model.ipynb.
Create a model in the model registry
Model registration allows you to store and version your models in the Azure cloud, in your workspace. The model registry helps you organize and keep track of your trained models.
The code snippets in this section cover how to:
- Register your model as an asset in Machine Learning by using the CLI.
- Register your model as an asset in Machine Learning by using the SDK.
- Register your model as an asset in Machine Learning by using the UI.
These snippets use `custom` and `mlflow`.

- `custom` is a type that refers to a model file or folder trained with a custom standard not currently supported by Azure Machine Learning.
- `mlflow` is a type that refers to a model trained with MLflow. MLflow-trained models are in a folder that contains the MLmodel file, the model file, the conda dependencies file, and the `requirements.txt` file.
Connect to your workspace
First, connect to the Azure Machine Learning workspace where you'll be working.
az account set --subscription <subscription>
az configure --defaults workspace=<workspace> group=<resource-group> location=<location>
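If you're using the Python SDK, the equivalent connection is made with an `MLClient`. This is a minimal sketch; the subscription, resource group, and workspace values are placeholders to replace with your own:

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Authenticate and get a handle to the workspace; the values below are placeholders.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)
```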
Register your model as an asset in Machine Learning by using the CLI
The following example registers a model from a file on your local computer:
$schema: https://azuremlschemas.azureedge.net/latest/model.schema.json
name: local-file-example
path: mlflow-model/model.pkl
description: Model created from local file.
az ml model create -f <file-name>.yml
For a complete example, see the model YAML.
Register your model as an asset in Machine Learning by using the SDK
The following example registers a model from a file on your local computer:
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes
file_model = Model(
path="mlflow-model/model.pkl",
type=AssetTypes.CUSTOM_MODEL,
name="local-file-example",
description="Model created from local file.",
)
ml_client.models.create_or_update(file_model)
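To register an MLflow model instead of a custom file, point `path` at the model folder (the one containing the MLmodel file) and set the type accordingly. A sketch, assuming a local folder named `mlflow-model`:

```python
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes

# Register the whole MLflow model folder rather than a single pickle file.
mlflow_model = Model(
    path="mlflow-model",
    type=AssetTypes.MLFLOW_MODEL,
    name="local-mlflow-example",
    description="MLflow model created from a local folder.",
)
ml_client.models.create_or_update(mlflow_model)
```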
Register your model as an asset in Machine Learning by using the UI
To create a model in Machine Learning, from the UI, open the Models page. Select Register model, and select where your model is located. Fill out the required fields, and then select Register.
Manage models
The SDK and CLI (v2) also allow you to manage the lifecycle of your Azure Machine Learning model assets.
List
List all the models in your workspace:
az ml model list
List all the model versions under a given name:
az ml model list --name run-model-example
Show
Get the details of a specific model:
az ml model show --name run-model-example --version 1
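The same operations are available through the Python SDK. A minimal sketch that lists models and versions and then fetches one model's details, assuming an authenticated `ml_client`:

```python
# List every model in the workspace (one entry per model name).
for model in ml_client.models.list():
    print(model.name)

# List all versions registered under a given name.
for version in ml_client.models.list(name="run-model-example"):
    print(version.version)

# Get the details of a specific model version.
model = ml_client.models.get(name="run-model-example", version="1")
print(model.description)
```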
Update
Update mutable properties of a specific model:
az ml model update --name run-model-example --version 1 --set description="This is an updated description." --set tags.stage="Prod"
Important
For models, only `description` and `tags` can be updated. All other properties are immutable; if you need to change any of them, you should create a new version of the model.
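With the Python SDK, one common pattern is to fetch the model, change its mutable fields, and submit it again with `create_or_update`. A sketch, assuming an authenticated `ml_client`:

```python
# Fetch the model, modify its mutable properties, and push the update.
model = ml_client.models.get(name="run-model-example", version="1")
model.description = "This is an updated description."
model.tags = {**(model.tags or {}), "stage": "Prod"}
ml_client.models.create_or_update(model)
```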
Archive
Archiving a model hides it by default from list queries (`az ml model list`). You can still reference and use an archived model in your workflows. You can archive either all versions of a model or only a specific version.
If you don't specify a version, all versions of the model under that given name will be archived. If you create a new model version under an archived model container, that new version will automatically be set as archived as well.
Archive all versions of a model:
az ml model archive --name run-model-example
Archive a specific model version:
az ml model archive --name run-model-example --version 1
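The Python SDK exposes the same operation through `ml_client.models.archive`. A minimal sketch:

```python
# Archive one specific version of the model.
ml_client.models.archive(name="run-model-example", version="1")

# Archive all versions under the model name.
ml_client.models.archive(name="run-model-example")
```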
Use model for training
The SDK and CLI (v2) also allow you to use a model in a training job as an input or output.
Use model as input in a job
Create a job specification YAML file (`<file-name>.yml`). In the `inputs` section of the job, specify:

- The `type`: whether the model is an `mlflow_model`, `custom_model`, or `triton_model`.
- The `path` to where your data is located; this can be any of the paths outlined in the Supported paths section.
$schema: https://azuremlschemas.azureedge.net/latest/commandJob.schema.json
# Possible Paths for models:
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>
command: |
ls ${{inputs.my_model}}
inputs:
my_model:
type: mlflow_model # List of all model types here: https://learn.microsoft.com/azure/machine-learning/reference-yaml-model#yaml-syntax
path: ../../assets/model/mlflow-model
environment: azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest
Next, run the following command in the CLI:
az ml job create -f <file-name>.yml
For a complete example, see the model GitHub repo.
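The same job can be expressed with the Python SDK's `command` function. This is a sketch under the assumption that a compute target named `cpu-cluster` exists in the workspace and that `ml_client` is already authenticated:

```python
from azure.ai.ml import command, Input
from azure.ai.ml.constants import AssetTypes

# Define a command job that takes the MLflow model folder as an input and lists its contents.
job = command(
    command="ls ${{inputs.my_model}}",
    inputs={
        "my_model": Input(
            type=AssetTypes.MLFLOW_MODEL,
            path="../../assets/model/mlflow-model",
        )
    },
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",  # assumed compute target name
)

ml_client.jobs.create_or_update(job)
```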
Use model as output in a job
In your job, you can write a model to your cloud-based storage by using outputs.

Create a job specification YAML file (`<file-name>.yml`), with the `outputs` section populated with the type and path of where you want to write your data:
$schema: https://azuremlschemas.azureedge.net/latest/commandJob.schema.json
# Possible Paths for Model:
# Local path: mlflow-model/model.pkl
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>
code: src
command: >-
python hello-model-as-output.py
--input_model ${{inputs.input_model}}
--custom_model_output ${{outputs.output_folder}}
inputs:
input_model:
type: mlflow_model # mlflow_model,custom_model, triton_model
path: ../../assets/model/mlflow-model
outputs:
output_folder:
type: custom_model # mlflow_model,custom_model, triton_model
environment: azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest
Next create a job using the CLI:
az ml job create --file <file-name>.yml
For a complete example, see the model GitHub repo.
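A Python SDK version of the same job declares the output with an `Output` of type `custom_model`. A sketch, again assuming a `cpu-cluster` compute target, an authenticated `ml_client`, and a local `src` folder containing the script:

```python
from azure.ai.ml import command, Input, Output
from azure.ai.ml.constants import AssetTypes

# The script reads the input model and writes a new model to the declared output folder.
job = command(
    code="src",
    command=(
        "python hello-model-as-output.py "
        "--input_model ${{inputs.input_model}} "
        "--custom_model_output ${{outputs.output_folder}}"
    ),
    inputs={
        "input_model": Input(
            type=AssetTypes.MLFLOW_MODEL,
            path="../../assets/model/mlflow-model",
        )
    },
    outputs={"output_folder": Output(type=AssetTypes.CUSTOM_MODEL)},
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",  # assumed compute target name
)

ml_client.jobs.create_or_update(job)
```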