This tutorial shows how to enable OpenTelemetry on an agent so that interactions with the agent are automatically logged and exported. The output is written to the console using the OpenTelemetry console exporter.
Note
For more information about the standards followed by Microsoft Agent Framework, see Semantic Conventions for GenAI agent and framework spans from OpenTelemetry.
Prerequisites
For prerequisites, see the Create and run a simple agent step in this tutorial.
Install NuGet packages
To use Microsoft Agent Framework with Azure OpenAI, you need to install the following NuGet packages:
dotnet add package Azure.AI.OpenAI --prerelease
dotnet add package Azure.Identity
dotnet add package Microsoft.Agents.AI.OpenAI --prerelease
To add OpenTelemetry support, including the console exporter, install these additional packages:
dotnet add package OpenTelemetry
dotnet add package OpenTelemetry.Exporter.Console
Enable OpenTelemetry in your app
Enable Agent Framework telemetry and create an OpenTelemetry TracerProvider that exports to the console.
The TracerProvider must remain alive while you run the agent so traces are exported.
using System;
using OpenTelemetry;
using OpenTelemetry.Trace;
// Create a TracerProvider that exports to the console
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("agent-telemetry-source")
    .AddConsoleExporter()
    .Build();
Create and instrument the agent
Create an agent and, using the builder pattern, call UseOpenTelemetry to provide a source name. The string literal agent-telemetry-source must match the OpenTelemetry source name that you registered when you created the tracer provider.
using System;
using Azure.AI.OpenAI;
using Azure.Identity;
using Microsoft.Agents.AI;
using OpenAI;
// Create the agent and enable OpenTelemetry instrumentation
AIAgent agent = new AzureOpenAIClient(
        new Uri("https://<myresource>.openai.azure.com"),
        new AzureCliCredential())
    .GetChatClient("gpt-4o-mini")
    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker")
    .AsBuilder()
    .UseOpenTelemetry(sourceName: "agent-telemetry-source")
    .Build();
Run the agent and print the text response. The console exporter will show trace data on the console.
Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate."));
The expected output will be something like this, where the agent invocation trace is shown first, followed by the text response from the agent.
Activity.TraceId: f2258b51421fe9cf4c0bd428c87b1ae4
Activity.SpanId: 2cad6fc139dcf01d
Activity.TraceFlags: Recorded
Activity.DisplayName: invoke_agent Joker
Activity.Kind: Client
Activity.StartTime: 2025-09-18T11:00:48.6636883Z
Activity.Duration: 00:00:08.6077009
Activity.Tags:
    gen_ai.operation.name: chat
    gen_ai.request.model: gpt-4o-mini
    gen_ai.provider.name: openai
    server.address: <myresource>.openai.azure.com
    server.port: 443
    gen_ai.agent.id: 19e310a72fba4cc0b257b4bb8921f0c7
    gen_ai.agent.name: Joker
    gen_ai.response.finish_reasons: ["stop"]
    gen_ai.response.id: chatcmpl-CH6fgKwMRGDtGNO3H88gA3AG2o7c5
    gen_ai.response.model: gpt-4o-mini-2024-07-18
    gen_ai.usage.input_tokens: 26
    gen_ai.usage.output_tokens: 29
Instrumentation scope (ActivitySource):
    Name: agent-telemetry-source
Resource associated with Activity:
    telemetry.sdk.name: opentelemetry
    telemetry.sdk.language: dotnet
    telemetry.sdk.version: 1.13.1
    service.name: unknown_service:Agent_Step08_Telemetry
Why did the pirate go to school?
Because he wanted to improve his "arrr-ticulation"! ⛵
The rest of this tutorial shows how to enable the same OpenTelemetry instrumentation in Python, again writing output to the console using the OpenTelemetry console exporter.
Install packages
To use Agent Framework with Azure OpenAI, install the agent-framework package. It automatically includes all necessary OpenTelemetry dependencies:
pip install agent-framework --pre
The following OpenTelemetry packages are included by default:
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp-proto-grpc
opentelemetry-semantic-conventions-ai
If you want to export to Azure Monitor (Application Insights), you also need to install the azure-monitor-opentelemetry package:
pip install azure-monitor-opentelemetry
Enable OpenTelemetry in your app
Agent Framework provides a convenient setup_observability function that configures OpenTelemetry with sensible defaults.
By default, it exports to the console if no specific exporter is configured.
import asyncio
from agent_framework.observability import setup_observability
# Enable Agent Framework telemetry with console output (default behavior)
setup_observability(enable_sensitive_data=True)
Understanding setup_observability parameters
The setup_observability function accepts the following parameters to customize your observability configuration:
enable_otel (bool, optional): Enables OpenTelemetry tracing and metrics. Default is False when using environment variables only, but is assumed True when calling setup_observability() programmatically. When using environment variables, set ENABLE_OTEL=true.
enable_sensitive_data (bool, optional): Controls whether sensitive data like prompts, responses, function call arguments, and results are included in traces. Default is False. Set to True to see actual prompts and responses in your traces. Warning: Be careful with this setting as it might expose sensitive data in your logs. Can also be set via the ENABLE_SENSITIVE_DATA=true environment variable.
otlp_endpoint (str, optional): The OTLP endpoint URL for exporting telemetry data. Default is None. Commonly set to http://localhost:4317. This creates an OTLP exporter for spans, metrics, and logs, and can be used with any OTLP-compliant endpoint such as the OpenTelemetry Collector, the Aspire Dashboard, or other OTLP endpoints. Can also be set via the OTLP_ENDPOINT environment variable.
applicationinsights_connection_string (str, optional): Azure Application Insights connection string for exporting to Azure Monitor. Default is None. Creates AzureMonitorTraceExporter, AzureMonitorMetricExporter, and AzureMonitorLogExporter. You can find this connection string in the Azure portal under the "Overview" section of your Application Insights resource. Can also be set via the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable. Requires installation of the azure-monitor-opentelemetry package.
vs_code_extension_port (int, optional): Port number for the AI Toolkit or Azure AI Foundry VS Code extension. Default is 4317. Allows integration with VS Code extensions for local development and debugging. Can also be set via the VS_CODE_EXTENSION_PORT environment variable.
exporters (list, optional): Custom list of OpenTelemetry exporters for advanced scenarios. Default is None. Allows you to provide your own configured exporters when the standard options don't meet your needs.
Important
When no exporters are provided (through parameters, environment variables, or an explicit exporters list), the console exporter is configured by default for local debugging.
Setup options
You can configure observability in three ways:
1. Environment variables (simplest approach):
export ENABLE_OTEL=true
export ENABLE_SENSITIVE_DATA=true
export OTLP_ENDPOINT=http://localhost:4317
Then in your code:
from agent_framework.observability import setup_observability
setup_observability() # Reads from environment variables
2. Programmatic configuration:
from agent_framework.observability import setup_observability
# note that ENABLE_OTEL is implied to be True when calling setup_observability programmatically
setup_observability(
    enable_sensitive_data=True,
    otlp_endpoint="http://localhost:4317",
    applicationinsights_connection_string="InstrumentationKey=your_key"
)
3. Custom exporters (for advanced scenarios):
from agent_framework.observability import setup_observability
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import ConsoleSpanExporter
custom_exporters = [
    OTLPSpanExporter(endpoint="http://localhost:4317"),
    ConsoleSpanExporter()
]
setup_observability(exporters=custom_exporters, enable_sensitive_data=True)
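None of the three options above demonstrates the vs_code_extension_port parameter. The following is a minimal sketch of using it, assuming the AI Toolkit or Azure AI Foundry VS Code extension is listening on its default local port:

from agent_framework.observability import setup_observability

# Send telemetry to the AI Toolkit / Azure AI Foundry VS Code extension.
# 4317 is the default from the parameter list above; adjust it if your
# extension listens on a different local port.
setup_observability(
    enable_sensitive_data=True,
    vs_code_extension_port=4317,
)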
The setup_observability function sets the global tracer provider and meter provider, allowing you to create custom spans and metrics:
from agent_framework.observability import get_tracer, get_meter
tracer = get_tracer()
meter = get_meter()
with tracer.start_as_current_span("my_custom_span"):
    # Your code here
    pass
counter = meter.create_counter("my_custom_counter")
counter.add(1, {"key": "value"})
Create and run the agent
Create an agent using Agent Framework. Observability is automatically enabled for the agent once setup_observability has been called.
import asyncio

from agent_framework import ChatAgent
from agent_framework.azure import AzureOpenAIChatClient
from azure.identity import AzureCliCredential

async def main():
    # Create the agent - telemetry is automatically enabled
    agent = ChatAgent(
        chat_client=AzureOpenAIChatClient(
            credential=AzureCliCredential(),
            model="gpt-4o-mini"
        ),
        name="Joker",
        instructions="You are good at telling jokes."
    )

    # Run the agent and print the text response
    result = await agent.run("Tell me a joke about a pirate.")
    print(result.text)

asyncio.run(main())
The console exporter will show trace data on the console similar to the following:
{
    "name": "invoke_agent Joker",
    "context": {
        "trace_id": "0xf2258b51421fe9cf4c0bd428c87b1ae4",
        "span_id": "0x2cad6fc139dcf01d",
        "trace_state": "[]"
    },
    "kind": "SpanKind.CLIENT",
    "parent_id": null,
    "start_time": "2025-09-25T11:00:48.663688Z",
    "end_time": "2025-09-25T11:00:57.271389Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {
        "gen_ai.operation.name": "invoke_agent",
        "gen_ai.system": "openai",
        "gen_ai.agent.id": "Joker",
        "gen_ai.agent.name": "Joker",
        "gen_ai.request.instructions": "You are good at telling jokes.",
        "gen_ai.response.id": "chatcmpl-CH6fgKwMRGDtGNO3H88gA3AG2o7c5",
        "gen_ai.usage.input_tokens": 26,
        "gen_ai.usage.output_tokens": 29
    }
}
Followed by the text response from the agent:
Why did the pirate go to school?
Because he wanted to improve his "arrr-ticulation"! ⛵
Understanding the telemetry output
Once observability is enabled, Agent Framework automatically creates the following spans:
invoke_agent <agent_name>: The top-level span for each agent invocation. Contains all other spans as children and includes metadata like agent ID, name, and instructions.
chat <model_name>: Created when the agent calls the underlying chat model. Includes the prompt and response as attributes when enable_sensitive_data is True, along with token usage information.
execute_tool <function_name>: Created when the agent calls a function tool. Contains function arguments and results as attributes when enable_sensitive_data is True; see the function tool sketch after the metrics list below.
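These spans use standard OpenTelemetry context propagation, so an agent invocation started inside one of your own spans is recorded as its child. The following is a minimal sketch, reusing the Joker agent created earlier; the span name joke_session is just an example:

import asyncio

from agent_framework.observability import get_tracer

async def main():
    tracer = get_tracer()
    # The invoke_agent span (and its chat child span) nests under this span
    # because the agent picks up the active span context.
    # `agent` is the ChatAgent created in the previous section.
    with tracer.start_as_current_span("joke_session"):
        result = await agent.run("Tell me a joke about a pirate.")
        print(result.text)

asyncio.run(main())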
The following metrics are also collected:
For chat operations:
gen_ai.client.operation.duration (histogram): Duration of each operation in seconds
gen_ai.client.token.usage (histogram): Token usage in number of tokens
For function invocations:
agent_framework.function.invocation.duration (histogram): Duration of each function execution in seconds
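To see execute_tool spans and the function invocation duration metric in your output, give the agent a function tool to call. The following is a minimal sketch, assuming the tools parameter on ChatAgent as used in the Agent Framework samples; get_weather is a hypothetical tool defined here only for illustration:

from typing import Annotated

from agent_framework import ChatAgent
from agent_framework.azure import AzureOpenAIChatClient
from azure.identity import AzureCliCredential

def get_weather(
    location: Annotated[str, "The city to get the weather for."],
) -> str:
    """Return a canned weather report for the given city."""
    return f"It is sunny in {location}."

# Each tool call the model makes produces an execute_tool get_weather span
# and a sample in the agent_framework.function.invocation.duration histogram.
agent = ChatAgent(
    chat_client=AzureOpenAIChatClient(credential=AzureCliCredential()),
    name="WeatherAgent",
    instructions="Answer weather questions using the get_weather tool.",
    tools=[get_weather],
)

Running this agent with a question like "What's the weather in Amsterdam?" should produce an execute_tool get_weather span nested under the invoke_agent WeatherAgent span.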
Azure AI Foundry integration
If you're using Azure AI Foundry clients, there's a convenient method for automatic setup:
from agent_framework.azure import AzureAIAgentClient
from azure.identity import AzureCliCredential
agent_client = AzureAIAgentClient(
    credential=AzureCliCredential(),
    # endpoint and model_deployment_name can be taken from environment variables
    # project_endpoint="https://<your-project>.foundry.azure.com"
    # model_deployment_name="<your-deployment-name>"
)
# Automatically configures observability with Application Insights
await agent_client.setup_azure_ai_observability()
This method retrieves the Application Insights connection string from your Azure AI Foundry project and calls setup_observability automatically. If you want to use Foundry Telemetry with other types of agents, you can do the same thing with:
from agent_framework.observability import setup_observability
from azure.ai.projects import AIProjectClient
from azure.identity import AzureCliCredential
project_client = AIProjectClient(
    endpoint="https://<your-project>.foundry.azure.com",
    credential=AzureCliCredential(),
)
conn_string = project_client.telemetry.get_application_insights_connection_string()
setup_observability(applicationinsights_connection_string=conn_string)
Also see the relevant Foundry documentation.
Note
When using Azure Monitor for your telemetry, you need to install the azure-monitor-opentelemetry package explicitly, as it is not included by default with Agent Framework.
Next steps
For more advanced observability scenarios and examples, see the Agent Observability user guide and the observability samples in the GitHub repository.