Trace Semantic Kernel raw payload to/from Azure OpenAI services

Bo Bo 20 Reputation points
2024-09-18T09:52:25.36+00:00

Hi All

I have been looking at the Semantic Kernel C# library.

I wonder if there is a way to see the actual payload sent from the Semantic Kernel library to Azure OpenAI.

For example, in the code below, how do I see the full HTTP payload sent to the Azure OpenAI service?

        static async Task Main(string[] args)
        {
            IKernelBuilder builder = Kernel.CreateBuilder();
            builder.AddAzureOpenAIChatCompletion(
                deploymentName: "your-deployment-name",
                endpoint: "your-azure-openai-endpoint",
                apiKey: "your-azure-openai-api-key"
            );

            Kernel kernel = builder.Build();

            var answer = await kernel.InvokePromptAsync(
                "question?"
            );

            Console.WriteLine(answer);
        }


Accepted answer
    Daniel FANG 545 Reputation points
    2024-09-18T10:00:14.1266667+00:00

    Yes, it is possible to add a logger to the kernel. Try out the code below.

    There are more details about Semantic Kernel observability here:

    https://learn.microsoft.com/en-us/semantic-kernel/enterprise-readiness/observability/telemetry-with-console?tabs=Powershell-CreateFile%2CEnvironmentFile&pivots=programming-language-csharp

    
    using Microsoft.Extensions.DependencyInjection;
    using Microsoft.Extensions.Logging;
    using Microsoft.SemanticKernel;
    using OpenTelemetry;
    using OpenTelemetry.Logs;
    using OpenTelemetry.Resources;

    // Identifies your app in the exported telemetry.
    var resourceBuilder = ResourceBuilder.CreateDefault().AddService("SemanticKernelDemo");

    // Logger factory that exports Semantic Kernel logs (including rendered prompts) to the console.
    using var loggerFactory = LoggerFactory.Create(builder =>
    {
        builder.AddOpenTelemetry(options =>
        {
            options.SetResourceBuilder(resourceBuilder);
            options.AddConsoleExporter();
            options.IncludeFormattedMessage = true;
            options.IncludeScopes = true;
        });
        builder.SetMinimumLevel(LogLevel.Information);
    });

    IKernelBuilder builder = Kernel.CreateBuilder();
    builder.Services.AddSingleton(loggerFactory);
    builder.AddAzureOpenAIChatCompletion(
        deploymentName: "your-deployment-name",
        endpoint: "your-azure-openai-endpoint",
        apiKey: "your-azure-openai-api-key");

    Kernel kernel = builder.Build();
    var answer = await kernel.InvokePromptAsync("question?");
    Console.WriteLine(answer);
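
    Depending on the log level, the log output may not include the full request and response content. The observability page linked above also describes an experimental AppContext switch that controls whether prompt and completion content is included in the emitted telemetry; the switch name below is taken from that page, so double-check it against the current docs:

    // Experimental switch from the Semantic Kernel observability docs; enable it before building
    // the kernel so that prompt/completion content can be included in the emitted telemetry.
    AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);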
    
    
    
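    If you want the literal HTTP request and response bodies (the raw JSON) rather than log messages, another option is to give the connector an HttpClient whose pipeline logs the traffic. This is a minimal sketch, assuming the AddAzureOpenAIChatCompletion overload that accepts an httpClient parameter; PayloadLoggingHandler is a hypothetical helper written for this example, not part of Semantic Kernel:

    using Microsoft.SemanticKernel;

    // Hand the connector an HttpClient whose pipeline dumps each request/response body to the console.
    var httpClient = new HttpClient(new PayloadLoggingHandler(new HttpClientHandler()));

    IKernelBuilder builder = Kernel.CreateBuilder();
    builder.AddAzureOpenAIChatCompletion(
        deploymentName: "your-deployment-name",
        endpoint: "your-azure-openai-endpoint",
        apiKey: "your-azure-openai-api-key",
        httpClient: httpClient); // assumes the overload that takes a custom HttpClient

    Kernel kernel = builder.Build();
    Console.WriteLine(await kernel.InvokePromptAsync("question?"));

    // Hypothetical helper (not part of Semantic Kernel): logs the raw JSON payloads.
    public sealed class PayloadLoggingHandler : DelegatingHandler
    {
        public PayloadLoggingHandler(HttpMessageHandler inner) : base(inner) { }

        protected override async Task<HttpResponseMessage> SendAsync(
            HttpRequestMessage request, CancellationToken cancellationToken)
        {
            if (request.Content is not null)
            {
                Console.WriteLine("REQUEST:  " + await request.Content.ReadAsStringAsync(cancellationToken));
            }

            HttpResponseMessage response = await base.SendAsync(request, cancellationToken);
            Console.WriteLine("RESPONSE: " + await response.Content.ReadAsStringAsync(cancellationToken));
            return response;
        }
    }

    Note that this prints the payloads unredacted, including anything sensitive in your prompts, so it is only meant for local debugging.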
    1 person found this answer helpful.
