Azure ML Package client library for Python - version 1.15.0

We are excited to introduce the GA of the Azure Machine Learning Python SDK v2. The Python SDK v2 introduces new capabilities such as standalone local jobs, reusable components for pipelines, and managed online/batch inferencing. It lets you move from simple to complex tasks easily and incrementally, enabled by a common object model that brings concept reuse and consistency of actions across tasks. The SDK v2 shares its foundation with the CLI v2, which is also GA.

Source code | Package (PyPI) | Package (Conda) | API reference documentation | Product documentation | Samples

This package has been tested with Python 3.8, 3.9, 3.10, 3.11 and 3.12.

For a more complete set of Azure libraries, see https://aka.ms/azsdk/python/all

Getting started

Prerequisites

  • Python 3.8 or later
  • An Azure subscription
  • An Azure Machine Learning workspace

Install the package

Install the Azure ML client library for Python with pip:

pip install azure-ai-ml
pip install azure-identity

Authenticate the client

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Connect to an existing Azure ML workspace; subscription_id, resource_group,
# and workspace are placeholders for your own workspace details.
ml_client = MLClient(
    DefaultAzureCredential(), subscription_id, resource_group, workspace
)
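
As a quick check that the credential and workspace details resolve correctly, you can read the workspace back from the service (a minimal sketch):

# Read the workspace back; this fails fast if the credential or the
# workspace details are wrong.
ws = ml_client.workspaces.get(name=workspace)
print(ws.name, ws.location)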

Key concepts

Azure Machine Learning Python SDK v2 comes with many new features, such as standalone local jobs, reusable components for pipelines, and managed online/batch inferencing. The SDK v2 brings consistency and ease of use across all assets of the platform. The Python SDK v2 offers the following capabilities:

  • Run Standalone Jobs - run a discrete ML activity as a Job. The job can be run locally or in the cloud (see the command job sketch after this list). We currently support the following types of jobs:
    • Command - run a command (Python, R, Windows Command, Linux Shell, etc.)
    • Sweep - run a hyperparameter sweep on your Command
  • Run multiple jobs using our improved Pipelines
    • Run a series of commands stitched into a pipeline (New)
    • Components - run pipelines using reusable components (New)
  • Use your models for Managed Online inferencing (New)
  • Use your models for Managed batch inferencing
  • Manage AML resources - workspace, compute, datastores
  • Manage AML assets - datasets, environments, models
  • AutoML - run standalone AutoML training for various ML tasks:
    • Classification (Tabular data)
    • Regression (Tabular data)
    • Time Series Forecasting (Tabular data)
    • Image Classification (Multi-class) (New)
    • Image Classification (Multi-label) (New)
    • Image Object Detection (New)
    • Image Instance Segmentation (New)
    • NLP Text Classification (Multi-class) (New)
    • NLP Text Classification (Multi-label) (New)
    • NLP Text Named Entity Recognition (NER) (New)
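
As referenced above, here is a minimal sketch of submitting a standalone command job with the client created earlier; the ./src folder, train.py script, compute target, and environment names are placeholders for code and resources in your own workspace:

from azure.ai.ml import command

# Define a command job: run train.py from the local ./src folder on an
# existing compute cluster, using a curated training environment.
job = command(
    code="./src",
    command="python train.py",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",
    display_name="train-example",
)

# Submit the job to the workspace and stream its logs until it finishes.
returned_job = ml_client.jobs.create_or_update(job)
ml_client.jobs.stream(returned_job.name)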

Examples

Troubleshooting

General

Azure ML clients raise exceptions defined in Azure Core.

from azure.core.exceptions import HttpResponseError

try:
    # "cpu-cluster" is an example compute target name in the workspace.
    ml_client.compute.get("cpu-cluster")
except HttpResponseError as error:
    print("Request failed: {}".format(error.message))

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument.
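
For example, a minimal sketch of enabling verbose output for a single client, using the standard logging module together with logging_enable (the workspace values are the same placeholders used above):

import logging
import sys

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Send the Azure SDK's log output to stdout at DEBUG level.
logger = logging.getLogger("azure")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(stream=sys.stdout))

# logging_enable=True turns on detailed request/response logging for this client.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id,
    resource_group,
    workspace,
    logging_enable=True,
)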

See full SDK logging documentation with examples here.

Telemetry

The Azure ML Python SDK includes a telemetry feature that collects usage and failure data about the SDK and sends it to Microsoft when you use the SDK in a Jupyter Notebook only. Telemetry will not be collected for any use of the Python SDK outside of a Jupyter Notebook.

Telemetry data helps the SDK team understand how the SDK is used so it can be improved and the information about failures helps the team resolve problems and fix bugs. The SDK telemetry feature is enabled by default for Jupyter Notebook usage and cannot be enabled for non-Jupyter scenarios. To opt out of the telemetry feature in a Jupyter scenario, pass in enable_telemetry=False when constructing your MLClient object.
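
For example, using the same workspace placeholders as above:

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Opt out of telemetry collection when using the SDK in a Jupyter Notebook.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id,
    resource_group,
    workspace,
    enable_telemetry=False,
)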

Next steps

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.