The Azure Developer CLI enables you to quickly and easily deploy to an Azure ML Studio or Microsoft Foundry online endpoint. `azd` supports the following Foundry/ML studio features, which you'll learn to configure in the sections ahead:
- Custom environments
  - Environments can be viewed in Azure ML Studio under the **Environments** section.
- Custom models
  - Models can be viewed in Azure ML Studio under the **Models** section.
- Prompt flows
  - Flows can be viewed in Azure ML Studio and in the Foundry portal under the **Flows** section.
- Online deployments (within an online endpoint)
  - Deployments can be viewed in Azure ML Studio and in the Foundry portal under the **Deployments** section.
Prerequisites
To work with Foundry/ML studio online endpoints, you'll need the following:
- Azure Subscription with OpenAI access enabled
- AI Hub Resource
- AI Project
- OpenAI Service
- Online Endpoint
- AI Search Service (Optional, enabled by default)
The Foundry Starter template can help create all the required infrastructure to get started with Foundry endpoints.
Configure the Foundry/ML studio online endpoint
Configure support for AI/ML online endpoints in the `services` section of the `azure.yaml` file:

- Set the `host` value to `ai.endpoint`.
- The `config` section for `ai.endpoint` supports the following configurations:
  - workspace: The name of the Foundry workspace. Supports `azd` environment variable substitution using `${VAR_NAME}` syntax.
    - If not specified, `azd` looks for an environment variable named `AZUREAI_PROJECT_NAME`.
  - environment: Optional custom configuration for ML environments. `azd` creates a new environment version from the referenced YAML file definition.
  - flow: Optional custom configuration for flows. `azd` creates a new prompt flow from the specified file path.
  - model: Optional custom configuration for ML models. `azd` creates a new model version from the referenced YAML file definition.
  - deployment: Required configuration for online endpoint deployments. `azd` creates a new online deployment to the associated online endpoint from the referenced YAML file definition.
Consider the following sample azure.yaml file that configures these features:
```yaml
name: contoso-chat
metadata:
  template: contoso-chat@0.0.1-beta
services:
  chat:
    # Referenced new ai.endpoint host type
    host: ai.endpoint
    # New config flow for AI project configuration
    config:
      # The name of the Foundry workspace
      workspace: ${AZUREAI_PROJECT_NAME}
      # Optional: Path to custom ML environment manifest
      environment:
        path: deployment/docker/environment.yml
      # Optional: Path to your prompt flow folder that contains the flow manifest
      flow:
        path: ./contoso-chat
      # Optional: Path to custom model manifest
      model:
        path: deployment/chat-model.yaml
        overrides:
          "properties.azureml.promptflow.source_flow_id": ${AZUREAI_FLOW_NAME}
      # Required: Path to deployment manifest
      deployment:
        path: deployment/chat-deployment.yaml
        environment:
          PRT_CONFIG_OVERRIDE: deployment.subscription_id=${AZURE_SUBSCRIPTION_ID},deployment.resource_group=${AZURE_RESOURCE_GROUP},deployment.workspace_name=${AZUREAI_PROJECT_NAME},deployment.endpoint_name=${AZUREAI_ENDPOINT_NAME},deployment.deployment_name=${AZUREAI_DEPLOYMENT_NAME}
```
The `config.deployment` section is required and creates a new online deployment to the associated online endpoint from the referenced YAML file definition. This functionality handles various concerns for you, including the following:

- Associated environments and models are referenced when available.
- `azd` waits for the deployment to enter a terminal provisioning state.
- On successful deployments, all traffic is shifted to the new deployment version.
- All previous deployments are deleted to free up compute for future deployments.
Explore configuration options
Each supported feature for AI/ML online endpoints supports customizations for your specific scenario using the options described in the following sections.
Flow
The flow configuration section is optional and supports the following values:
- name: The name of the flow. Defaults to `<service-name>-flow-<timestamp>` if not specified.
- path: The relative path to a folder that contains the flow manifest.
- overrides: Any custom overrides to apply to the flow.
Note
Each call to `azd deploy` creates a new timestamped flow.
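For example, a flow section might look like the following sketch. The path is taken from the sample `azure.yaml` above; the explicit name is hypothetical and can be omitted to accept the default:

```yaml
flow:
  # Hypothetical name; omit to get the default <service-name>-flow-<timestamp>
  name: contoso-chat-flow
  # Relative path to the folder containing the flow manifest (from the sample above)
  path: ./contoso-chat
```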
Environment
The environment configuration section is optional and supports the following values:
- name: The name of the custom environment. Defaults to `<service-name>-environment` if not specified.
- path: The relative path to a custom environment YAML manifest.
- overrides: Any custom overrides to apply to the environment.
Note
Each call to `azd deploy` creates a new environment version.
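A minimal environment section, reusing the manifest path from the sample `azure.yaml` above with a hypothetical explicit name:

```yaml
environment:
  # Hypothetical name; omit to default to <service-name>-environment
  name: contoso-chat-environment
  # Relative path to the environment manifest (from the sample above)
  path: deployment/docker/environment.yml
```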
Model
The model configuration section is optional and supports the following values:

- name: The name of the custom model. Defaults to `<service-name>-model` if not specified.
- path: The relative path to a custom model YAML manifest.
- overrides: Any custom overrides to apply to the model.
Note
Each call to `azd deploy` creates a new model version.
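A model section with an override might look like the following sketch; the path and override key are taken from the sample `azure.yaml` above, while the explicit name is hypothetical:

```yaml
model:
  # Hypothetical name; omit to default to <service-name>-model
  name: contoso-chat-model
  # Relative path to the model manifest (from the sample above)
  path: deployment/chat-model.yaml
  overrides:
    # Override from the sample above; ${AZUREAI_FLOW_NAME} is substituted by azd
    "properties.azureml.promptflow.source_flow_id": ${AZUREAI_FLOW_NAME}
```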
Deployment
The deployment configuration section is required and supports the following values:

- name: The name of the custom deployment. Defaults to `<service-name>-deployment` if not specified.
- path: The relative path to a custom deployment YAML manifest.
- environment: A map of key/value pairs that sets environment variables for the deployment. Supports substitution from OS/azd environment variables using `${VAR_NAME}` syntax.
- overrides: Any custom overrides to apply to the deployment.
Note
Only managed online deployments are supported.
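A deployment section can also pass environment variables to the deployment, as shown in this sketch. The path and the `${VAR_NAME}` substitution syntax come from the sample `azure.yaml` above; the explicit name and the specific variable shown are illustrative:

```yaml
deployment:
  # Hypothetical name; omit to default to <service-name>-deployment
  name: contoso-chat-deployment
  # Relative path to the deployment manifest (from the sample above)
  path: deployment/chat-deployment.yaml
  environment:
    # ${VAR_NAME} values are substituted from OS/azd environment variables
    AZURE_RESOURCE_GROUP: ${AZURE_RESOURCE_GROUP}
```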