You can extend the capabilities of an Azure AI Foundry agent by connecting it to tools hosted on a remote Model Context Protocol (MCP) server (bring your own MCP server endpoint).
How to use Model Context Protocol tools
This section shows how to create an AI agent by using Azure AI Foundry with hosted Model Context Protocol (MCP) server integration. The agent can use MCP tools that are managed and executed by the Azure AI Foundry service, enabling secure, controlled access to external resources.
Key features
- Hosted MCP servers: MCP servers are hosted and managed by Azure AI Foundry, so there is no server infrastructure to manage
- Persistent agents: agents are created and stored server-side, enabling stateful conversations
- Tool approval workflow: a configurable approval mechanism for MCP tool calls
How it works
1. Environment setup
This example requires two environment variables:
- AZURE_FOUNDRY_PROJECT_ENDPOINT: the Azure AI Foundry project endpoint URL
- AZURE_FOUNDRY_PROJECT_MODEL_ID: the model deployment name (defaults to "gpt-4.1-mini")
var endpoint = Environment.GetEnvironmentVariable("AZURE_FOUNDRY_PROJECT_ENDPOINT")
?? throw new InvalidOperationException("AZURE_FOUNDRY_PROJECT_ENDPOINT is not set.");
var model = Environment.GetEnvironmentVariable("AZURE_FOUNDRY_PROJECT_MODEL_ID") ?? "gpt-4.1-mini";
2. Agent configuration
The agent is configured with specific instructions and metadata:
const string AgentName = "MicrosoftLearnAgent";
const string AgentInstructions = "You answer questions by searching the Microsoft Learn content only.";
This creates an agent dedicated to answering questions by using Microsoft Learn documentation.
3. MCP tool definition
The example creates an MCP tool definition that points to a hosted MCP server:
var mcpTool = new MCPToolDefinition(
serverLabel: "microsoft_learn",
serverUrl: "https://learn.microsoft.com/api/mcp");
mcpTool.AllowedTools.Add("microsoft_docs_search");
Key components:
- serverLabel: a unique identifier for the MCP server instance
- serverUrl: the URL of the hosted MCP server
- AllowedTools: specifies which tools from the MCP server the agent can use
4. Persistent agent creation
The agent is created server-side by using the Azure AI Foundry Persistent Agents SDK:
var persistentAgentsClient = new PersistentAgentsClient(endpoint, new DefaultAzureCredential());
var agentMetadata = await persistentAgentsClient.Administration.CreateAgentAsync(
model: model,
name: AgentName,
instructions: AgentInstructions,
tools: [mcpTool]);
Warning
DefaultAzureCredential is convenient for development but needs careful consideration in production. In production environments, consider using a specific credential, such as ManagedIdentityCredential, to avoid latency issues, unexpected credential probing, and the potential security risks of fallback mechanisms.
This creates a persistent agent that:
- Lives in the Azure AI Foundry service
- Has access to the specified MCP tools
- Can maintain conversation state across multiple interactions
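As noted in the warning above, a production deployment would typically use a more specific credential. The following is a minimal sketch, assuming the code runs on an Azure resource whose managed identity has access to the Foundry project (the client ID placeholder is illustrative, not part of the original sample):
// Sketch: use the host's system-assigned managed identity instead of DefaultAzureCredential.
var persistentAgentsClient = new PersistentAgentsClient(endpoint, new ManagedIdentityCredential());
// For a user-assigned managed identity, pass its client ID (placeholder value):
// var persistentAgentsClient = new PersistentAgentsClient(endpoint, new ManagedIdentityCredential("<user-assigned-client-id>"));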
5. Agent retrieval and execution
The created agent is retrieved as an AIAgent instance:
AIAgent agent = await persistentAgentsClient.GetAIAgentAsync(agentMetadata.Value.Id);
6. Tool resource configuration
The example configures the tool resources with approval settings:
var runOptions = new ChatClientAgentRunOptions()
{
ChatOptions = new()
{
RawRepresentationFactory = (_) => new ThreadAndRunOptions()
{
ToolResources = new MCPToolResource(serverLabel: "microsoft_learn")
{
RequireApproval = new MCPApproval("never"),
}.ToToolResources()
}
}
};
Key configuration:
- MCPToolResource: links the MCP server instance to the agent run
- RequireApproval: controls when tool calls require user approval
  - "never": tools execute automatically without approval
  - "always": all tool calls require user approval
  - Custom approval rules can also be configured
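For comparison, here is a sketch of the same run options with approval required for every call; it mirrors the example above with only the approval string changed:
var approvalRequiredRunOptions = new ChatClientAgentRunOptions()
{
    ChatOptions = new()
    {
        RawRepresentationFactory = (_) => new ThreadAndRunOptions()
        {
            ToolResources = new MCPToolResource(serverLabel: "microsoft_learn")
            {
                // Every MCP tool call now pauses until the user approves it.
                RequireApproval = new MCPApproval("always"),
            }.ToToolResources()
        }
    }
};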
7. Agent execution
The agent is invoked with the configured MCP tool to run the question:
AgentSession session = await agent.CreateSessionAsync();
var response = await agent.RunAsync(
"Please summarize the Azure AI Agent documentation related to MCP Tool calling?",
session,
runOptions);
Console.WriteLine(response);
8. Cleanup
The example demonstrates proper resource cleanup:
await persistentAgentsClient.Administration.DeleteAgentAsync(agent.Id);
Tip
For a complete runnable example, see the .NET sample.
Azure AI Foundry integrates seamlessly with Model Context Protocol (MCP) servers through the Python Agent Framework. The service manages MCP server hosting and execution, eliminating infrastructure management while providing secure, controlled access to external tools.
Environment setup
Configure your Azure AI Foundry project credentials through environment variables:
import os
from azure.identity.aio import AzureCliCredential
from agent_framework.azure import AzureAIAgentClient
# Required environment variables
os.environ["AZURE_AI_PROJECT_ENDPOINT"] = "https://<your-project>.services.ai.azure.com/api/projects/<project-id>"
os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"] = "gpt-4o-mini" # Optional, defaults to this
Basic MCP integration
Create an Azure AI Foundry agent with a hosted MCP tool:
import asyncio
from agent_framework.azure import AzureAIAgentClient
from azure.identity.aio import AzureCliCredential
async def basic_foundry_mcp_example():
"""Basic example of Azure AI Foundry agent with hosted MCP tools."""
async with (
AzureCliCredential() as credential,
AzureAIAgentClient(async_credential=credential) as client,
):
# Create a hosted MCP tool using the client method
learn_mcp = client.get_mcp_tool(
name="Microsoft Learn MCP",
url="https://learn.microsoft.com/api/mcp",
)
# Create agent with hosted MCP tool
agent = client.as_agent(
name="MicrosoftLearnAgent",
instructions="You answer questions by searching Microsoft Learn content only.",
tools=learn_mcp,
)
# Simple query without approval workflow
result = await agent.run(
"Please summarize the Azure AI Agent documentation related to MCP tool calling?"
)
print(result)
if __name__ == "__main__":
asyncio.run(basic_foundry_mcp_example())
Multi-tool MCP configuration
Use multiple hosted MCP tools with a single agent:
async def multi_tool_mcp_example():
"""Example using multiple hosted MCP tools."""
async with (
AzureCliCredential() as credential,
AzureAIAgentClient(async_credential=credential) as client,
):
# Create multiple MCP tools using the client method
learn_mcp = client.get_mcp_tool(
name="Microsoft Learn MCP",
url="https://learn.microsoft.com/api/mcp",
approval_mode="never_require", # Auto-approve documentation searches
)
github_mcp = client.get_mcp_tool(
name="GitHub MCP",
url="https://api.github.com/mcp",
approval_mode="always_require", # Require approval for GitHub operations
headers={"Authorization": "Bearer github-token"},
)
# Create agent with multiple MCP tools
agent = client.as_agent(
name="MultiToolAgent",
instructions="You can search documentation and access GitHub repositories.",
tools=[learn_mcp, github_mcp],
)
result = await agent.run(
"Find Azure documentation and also check the latest commits in microsoft/semantic-kernel"
)
print(result)
if __name__ == "__main__":
asyncio.run(multi_tool_mcp_example())
The Python Agent Framework provides seamless integration with Azure AI Foundry's hosted MCP capabilities, enabling secure, scalable access to external tools while maintaining the flexibility and control that production applications require.
Complete example
The following complete sample connects to GitHub's remote MCP server with a Personal Access Token, using the OpenAI Responses client:
# Copyright (c) Microsoft. All rights reserved.
import asyncio
import os
from agent_framework import Agent
from agent_framework.openai import OpenAIResponsesClient
from dotenv import load_dotenv
"""
MCP GitHub Integration with Personal Access Token (PAT)
This example demonstrates how to connect to GitHub's remote MCP server using a Personal Access
Token (PAT) for authentication. The agent can use GitHub operations like searching repositories,
reading files, creating issues, and more depending on how you scope your token.
Prerequisites:
1. A GitHub Personal Access Token with appropriate scopes
- Create one at: https://github.com/settings/tokens
- For read-only operations, you can use more restrictive scopes
2. Environment variables:
- GITHUB_PAT: Your GitHub Personal Access Token (required)
- OPENAI_API_KEY: Your OpenAI API key (required)
- OPENAI_RESPONSES_MODEL_ID: Your OpenAI model ID (required)
"""
async def github_mcp_example() -> None:
"""Example of using GitHub MCP server with PAT authentication."""
# 1. Load environment variables from .env file if present
load_dotenv()
# 2. Get configuration from environment
github_pat = os.getenv("GITHUB_PAT")
if not github_pat:
raise ValueError(
"GITHUB_PAT environment variable must be set. Create a token at https://github.com/settings/tokens"
)
# 3. Create authentication headers with GitHub PAT
auth_headers = {
"Authorization": f"Bearer {github_pat}",
}
# 4. Create agent with the GitHub MCP tool using instance method
# The MCP tool manages the connection to the MCP server and makes its tools available
# Set approval_mode="never_require" to allow the MCP tool to execute without approval
client = OpenAIResponsesClient()
github_mcp_tool = client.get_mcp_tool(
name="GitHub",
url="https://api.githubcopilot.com/mcp/",
headers=auth_headers,
approval_mode="never_require",
)
# 5. Create agent with the GitHub MCP tool
async with Agent(
client=client,
name="GitHubAgent",
instructions=(
"You are a helpful assistant that can help users interact with GitHub. "
"You can search for repositories, read file contents, check issues, and more. "
"Always be clear about what operations you're performing."
),
tools=github_mcp_tool,
) as agent:
# Example 1: Get authenticated user information
query1 = "What is my GitHub username and tell me about my account?"
print(f"\nUser: {query1}")
result1 = await agent.run(query1)
print(f"Agent: {result1.text}")
# Example 2: List my repositories
query2 = "List all the repositories I own on GitHub"
print(f"\nUser: {query2}")
result2 = await agent.run(query2)
print(f"Agent: {result2.text}")
if __name__ == "__main__":
asyncio.run(github_mcp_example())