Microsoft Agent Framework supports multiple OpenAI client types. In C#, these are Chat Completion, Responses, and Assistants; in Python, the available OpenAI clients are Chat Completion and Responses:
| Client Type | API | Best For |
|---|---|---|
| Chat Completion | Chat Completions API | Simple agents, broad model support |
| Responses | Responses API | Full-featured agents with hosted tools (code interpreter, file search, web search, hosted MCP) |
| Assistants | Assistants API | Server-managed agents with code interpreter and file search |
Availability varies by language: the Python examples on this page use the Chat Completion and Responses clients, while the Assistants coverage below is C# only.
Getting Started
Add the required NuGet packages to your project.
```shell
dotnet add package Microsoft.Agents.AI.OpenAI --prerelease
```
Chat Completion Client
The Chat Completion client provides a straightforward way to create agents using the Chat Completions API.
```csharp
using Microsoft.Agents.AI;
using OpenAI;

OpenAIClient client = new OpenAIClient("<your_api_key>");
var chatClient = client.GetChatClient("gpt-4o-mini");

AIAgent agent = chatClient.AsAIAgent(
    instructions: "You are good at telling jokes.",
    name: "Joker");

Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate."));
```
Supported tools: Function tools, web search, local MCP tools.
Responses Client
The Responses client provides the richest tool support including code interpreter, file search, web search, and hosted MCP.
```csharp
using Microsoft.Agents.AI;
using OpenAI;

OpenAIClient client = new OpenAIClient("<your_api_key>");
var responsesClient = client.GetOpenAIResponseClient("gpt-4o-mini");

AIAgent agent = responsesClient.AsAIAgent(
    instructions: "You are a helpful coding assistant.",
    name: "CodeHelper");

Console.WriteLine(await agent.RunAsync("Write a Python function to sort a list."));
```
Supported tools: Function tools, tool approval, code interpreter, file search, web search, hosted MCP, local MCP tools.
Assistants Client
The Assistants client creates server-managed agents with built-in code interpreter and file search.
```csharp
using Microsoft.Agents.AI;
using OpenAI;

OpenAIClient client = new OpenAIClient("<your_api_key>");
var assistantsClient = client.GetAssistantClient();

// Assistants are managed server-side.
AIAgent agent = assistantsClient.AsAIAgent(
    instructions: "You are a data analysis assistant.",
    name: "DataHelper");

Console.WriteLine(await agent.RunAsync("Analyze trends in the uploaded data."));
```
Supported tools: Function tools, code interpreter, file search, local MCP tools.
Tip
See the .NET samples for complete runnable examples.
Using the Agent
All three client types produce a standard AIAgent that supports the same agent operations (streaming, threads, middleware).
For more information, see the Get Started tutorials.
Tip
In Python, Azure OpenAI now uses the same agent_framework.openai clients shown here. To route to Azure, pass explicit Azure inputs such as credential or azure_endpoint, and set api_version for the Azure API surface you want to use. If OPENAI_API_KEY is configured, the generic clients stay on OpenAI even when AZURE_OPENAI_* variables are also present. If you already have a full .../openai/v1 URL, pass it as base_url instead of azure_endpoint. For Microsoft Foundry project endpoints and the Foundry Agent Service, see the Microsoft Foundry provider page; for local runtimes, see Foundry Local.
Installation
```shell
pip install agent-framework-openai
```
agent-framework-openai is the optional Python provider package for both direct OpenAI and Azure OpenAI usage.
Configuration
The Python OpenAI chat clients use these environment-variable patterns:
```shell
OPENAI_API_KEY="your-openai-api-key"
OPENAI_CHAT_COMPLETION_MODEL="gpt-4o-mini"
# Optional shared fallback:
# OPENAI_MODEL="gpt-4o-mini"
```
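The fallback order these variables imply can be sketched as plain environment lookups. `resolve_model` below is an illustrative helper, not part of the framework; it only demonstrates the precedence: the client-specific variable wins, then the shared fallback.

```python
import os

def resolve_model() -> "str | None":
    """Illustrative precedence check: OPENAI_CHAT_COMPLETION_MODEL first,
    then the shared OPENAI_MODEL fallback."""
    return os.environ.get("OPENAI_CHAT_COMPLETION_MODEL") or os.environ.get("OPENAI_MODEL")

os.environ["OPENAI_MODEL"] = "gpt-4o-mini"
print(resolve_model())  # only the shared fallback is set → "gpt-4o-mini"

os.environ["OPENAI_CHAT_COMPLETION_MODEL"] = "gpt-4o"
print(resolve_model())  # the specific variable takes precedence → "gpt-4o"
```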
Common Features
All of these client types support the standard agent features below:
Function Tools
```python
from agent_framework import tool
from agent_framework.openai import OpenAIChatClient

@tool
def get_weather(location: str) -> str:
    """Get the weather for a given location."""
    return f"The weather in {location} is sunny, 25°C."

async def example():
    agent = OpenAIChatClient().as_agent(
        instructions="You are a weather assistant.",
        tools=get_weather,
    )
    result = await agent.run("What's the weather in Tokyo?")
    print(result)
```
Multi-Turn Conversations
```python
from agent_framework.openai import OpenAIChatClient

async def thread_example():
    agent = OpenAIChatClient().as_agent(
        instructions="You are a helpful assistant.",
    )
    session = await agent.create_session()

    result1 = await agent.run("My name is Alice", session=session)
    print(result1)

    result2 = await agent.run("What's my name?", session=session)
    print(result2)  # Remembers "Alice"
```
Streaming
```python
from agent_framework.openai import OpenAIChatClient

async def streaming_example():
    agent = OpenAIChatClient().as_agent(
        instructions="You are a creative storyteller.",
    )

    print("Agent: ", end="", flush=True)
    async for chunk in agent.run("Tell me a short story about AI.", stream=True):
        if chunk.text:
            print(chunk.text, end="", flush=True)
    print()
```
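The same consumption pattern can be exercised without a model call. The stub below stands in for the agent's stream (`Chunk` and `fake_stream` are illustrative, not framework types) and shows how text accumulates fragment by fragment:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Chunk:
    """Stand-in for a streamed update carrying a text fragment."""
    text: str

async def fake_stream():
    # Stub replacing the agent's streaming call, for demonstration only.
    for piece in ["Once ", "upon ", "a ", "time."]:
        yield Chunk(piece)

async def consume() -> str:
    story = ""
    async for chunk in fake_stream():
        if chunk.text:  # same guard as the example above
            story += chunk.text
    return story

print(asyncio.run(consume()))  # → Once upon a time.
```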
Using the Agent
All client types produce a standard Agent that supports the same operations.
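Because every client yields the same agent surface, helper code can be written once against that surface rather than per provider. `StubAgent` and `ask` below are illustrative stand-ins, not framework classes; they only show the shape of provider-agnostic code:

```python
import asyncio

class StubAgent:
    """Minimal stand-in exposing a run() coroutine like the agents above."""
    def __init__(self, reply: str):
        self._reply = reply

    async def run(self, message: str) -> str:
        return f"{self._reply}: {message}"

async def ask(agent, question: str) -> str:
    # Works with any object exposing the shared run() surface,
    # regardless of which client produced it.
    return await agent.run(question)

print(asyncio.run(ask(StubAgent("echo"), "hello")))  # → echo: hello
```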
For more information, see the Get Started tutorials.