Configure a connection to use Microsoft Foundry Models in your AI project

Note

This document refers to the Microsoft Foundry (classic) portal.

🔍 View the Microsoft Foundry (new) documentation to learn about the new portal.

You can use Microsoft Foundry Models in your Foundry projects to create rich applications and to interact with and manage the available models. To use the Foundry Models service in your project, you need to create a connection to the Foundry resource (formerly known as Azure AI Services).

This article explains how to create a connection to the Foundry resource (formerly known as Azure AI Services) to use Foundry Models.

A diagram with the overall architecture of Azure Marketplace integration with Foundry Models.

Prerequisites

To complete this article, you need:

  • An AI project resource.

  • The Deploy models to Azure AI model inference service feature is turned on.

    An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Microsoft Foundry portal.

Add a connection

You can create a connection to a Foundry Tools resource using the following steps:

  1. Go to Foundry portal.

  2. In the lower left corner of the screen, select Management center.

  3. In the Connected resources section, select New connection.

  4. Select Foundry Tools.

  5. In the resource browser, look for an existing Foundry Tools resource in your subscription.

  6. Select Add connection.

  7. The new connection is added to your Hub.

  8. Return to the project's landing page and select the newly created connection. Refresh the page if it doesn't show up immediately.

    Screenshot of the landing page for the project, highlighting the location of the connected resource and the associated inference endpoint.
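If you want to verify the connection programmatically rather than in the portal, a hub's connections can be listed through the Azure Resource Manager REST API (the same `Microsoft.MachineLearningServices/workspaces/connections` resource type used later in this article). The following sketch only assembles the request URL; the subscription ID, resource group, and hub name are placeholders you must replace:

```python
# Sketch: build the ARM REST URL that lists a hub's connections.
# All identifiers below are placeholders, not values from this article.
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
HUB_NAME = "my-hub"
API_VERSION = "2024-04-01-preview"  # preview API version used elsewhere in this article

url = (
    "https://management.azure.com"
    f"/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.MachineLearningServices"
    f"/workspaces/{HUB_NAME}/connections"
    f"?api-version={API_VERSION}"
)
print(url)
# Send the request with an Azure AD bearer token, for example:
#   az rest --method get --url "<url>"
```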

See model deployments in the connected resource

You can see the model deployments available in the connected resource by following these steps:

  1. Go to Foundry portal.

  2. On the left pane, select Models + endpoints.

  3. The page displays the model deployments available to you, grouped by connection name. Locate the connection you just created, which should be of type Foundry Tools.

    Screenshot showing the list of models available under a given connection.

  4. Select any model deployment you want to inspect.

  5. The details page shows information about the specific deployment. If you want to test the model, you can use the option Open in playground.

  6. The Foundry playground is displayed, where you can interact with the given model.
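Beyond the playground, you can call a deployment from code. The sketch below only constructs the request URL and JSON body for a chat-completions call; the resource name, deployment name, route, and `api-version` value are assumptions for illustration — copy the real inference endpoint shown on your project's landing page.

```python
import json

# Sketch of a chat-completions request against the Foundry Models inference
# endpoint. RESOURCE, API_VERSION, and the deployment name are placeholders;
# take the actual endpoint from your project's landing page.
RESOURCE = "my-foundry-resource"
API_VERSION = "2024-05-01-preview"

endpoint = (
    f"https://{RESOURCE}.services.ai.azure.com"
    f"/models/chat/completions?api-version={API_VERSION}"
)
payload = {
    "model": "my-deployment-name",  # deployment name from Models + endpoints
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)
# POST `body` to `endpoint` with an `api-key` header
# (or an Azure AD bearer token, depending on the connection's auth type).
print(endpoint)
```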

Azure CLI

Prerequisites

To complete this article, you need:

  • Install the Azure CLI and the ml extension for Microsoft Foundry:

    az extension add -n ml
    
  • Identify the following information:

    • Your Azure subscription ID.

    • Your Foundry Tools resource name.

    • The resource group where the Foundry Tools resource is deployed.

Add a connection

To add a connection, you first need to identify the Foundry Tools resource you want to connect to. Then create the connection as follows:

  1. Sign in to your Azure subscription:

    az login
    
  2. Configure the CLI to point to the project:

    az account set --subscription <subscription>
    az configure --defaults workspace=<project-name> group=<resource-group> location=<location>
    
  3. Create a connection definition:

    connection.yml

    name: <connection-name>
    type: aiservices
    endpoint: https://<ai-services-resourcename>.services.ai.azure.com
    api_key: <resource-api-key>
    
  4. Create the connection:

    az ml connection create -f connection.yml
    
  5. At this point, the connection is available for consumption.
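If you create connections for several resources, the connection.yml file can be generated rather than written by hand. This sketch templates the same fields the article's YAML uses; the name and resource are placeholders, and the API key should be substituted from a secure source at deploy time, never hard-coded:

```python
# Sketch: generate the connection.yml consumed by
#   az ml connection create -f connection.yml
# `name` and `resource` are placeholders; replace API_KEY_PLACEHOLDER with
# the resource key from a secure store before writing the file.
name = "my-aiservices-connection"   # placeholder
resource = "my-foundry-resource"    # placeholder

connection_yml = (
    f"name: {name}\n"
    "type: aiservices\n"
    f"endpoint: https://{resource}.services.ai.azure.com\n"
    "api_key: API_KEY_PLACEHOLDER\n"
)
print(connection_yml)
```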

Bicep

Prerequisites

To complete this article, you need:

  • A Foundry project with an AI Hub.

  • Install the Azure CLI.

  • Identify the following information:

    • Your Azure subscription ID.

    • Your Foundry Tools resource name.

    • Your Foundry Tools resource ID.

    • The name of the Azure AI Hub where the project is deployed.

    • The resource group where the Foundry Tools resource is deployed.

Add a connection

  1. Use the template ai-services-connection-template.bicep to describe the connection:

    ai-services-connection-template.bicep

    @description('Name of the hub where the connection will be created')
    param hubName string
    
    @description('Name of the connection')
    param name string
    
    @description('Category of the connection')
    param category string = 'AIServices'
    
    @allowed(['AAD', 'ApiKey', 'ManagedIdentity', 'None'])
    param authType string = 'AAD'
    
    @description('The endpoint URI of the connected service')
    param endpointUri string
    
    @description('The resource ID of the connected service')
    param resourceId string = ''
    
    @secure()
    param key string = ''
    
    
    resource connection 'Microsoft.MachineLearningServices/workspaces/connections@2024-04-01-preview' = {
      name: '${hubName}/${name}'
      properties: {
        category: category
        target: endpointUri
        authType: authType
        isSharedToAll: true
        credentials: authType == 'ApiKey' ? {
          key: key
        } : null
        metadata: {
          ApiType: 'Azure'
          ResourceId: resourceId
        }
      }
    }
    
  2. Run the deployment:

    RESOURCE_GROUP="<resource-group-name>"
    ACCOUNT_NAME="<azure-ai-model-inference-name>" 
    ENDPOINT_URI="https://<azure-ai-model-inference-name>.services.ai.azure.com"
    RESOURCE_ID="<resource-id>"
    HUB_NAME="<hub-name>"
    
    az deployment group create \
        --resource-group $RESOURCE_GROUP \
        --template-file ai-services-connection-template.bicep \
        --parameters name=$ACCOUNT_NAME hubName=$HUB_NAME endpointUri=$ENDPOINT_URI resourceId=$RESOURCE_ID
    
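The key detail in the template is the credentials ternary: a secret is embedded only when authType is 'ApiKey'; with the default 'AAD', no key is stored and Microsoft Entra ID authentication is used. The Python sketch below mirrors that logic (it is an illustration of the template's behavior, not Azure code):

```python
# Sketch mirroring the Bicep template's credential logic: the connection
# carries an api-key credential only when authType is 'ApiKey'.
def connection_properties(endpoint_uri, auth_type="AAD", key=""):
    return {
        "category": "AIServices",
        "target": endpoint_uri,
        "authType": auth_type,
        "isSharedToAll": True,
        # Same ternary as the template: credentials only for ApiKey auth.
        "credentials": {"key": key} if auth_type == "ApiKey" else None,
    }

aad = connection_properties("https://example.services.ai.azure.com")
api = connection_properties(
    "https://example.services.ai.azure.com", "ApiKey", "secret"
)
```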

Next steps