Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.
Hi All,
I have been looking at the Semantic Kernel C# library. I wonder if there is a way to see the actual payload the Semantic Kernel library sends to Azure OpenAI. For example, in the code below, how do I see the full HTTP payload sent to the Azure OpenAI service?
using Microsoft.SemanticKernel;

static async Task Main(string[] args)
{
    // Build a kernel backed by an Azure OpenAI chat completion deployment.
    IKernelBuilder builder = Kernel.CreateBuilder();
    builder.AddAzureOpenAIChatCompletion(
        deploymentName: "your-deployment-name",
        endpoint: "your-azure-openai-endpoint",
        apiKey: "your-azure-openai-api-key");
    Kernel kernel = builder.Build();

    // How can I inspect the HTTP request this call produces?
    var answer = await kernel.InvokePromptAsync("question?");
    Console.WriteLine(answer);
}
Yes, it is possible to add a logger to the kernel. Try out the code below.
There are more details in the Semantic Kernel observability documentation.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using OpenTelemetry.Logs;
using OpenTelemetry.Resources;

// Identifies this application in the exported telemetry.
var resourceBuilder = ResourceBuilder.CreateDefault().AddService("SemanticKernelDemo");

// Route Semantic Kernel log records to the console via OpenTelemetry.
using var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddOpenTelemetry(options =>
    {
        options.SetResourceBuilder(resourceBuilder);
        options.AddConsoleExporter();
        options.IncludeFormattedMessage = true;
        options.IncludeScopes = true;
    });
    builder.SetMinimumLevel(LogLevel.Information);
});

IKernelBuilder builder = Kernel.CreateBuilder();
builder.Services.AddSingleton(loggerFactory);  // Kernel services pick up this logger factory.
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "your-deployment-name",
    endpoint: "your-azure-openai-endpoint",
    apiKey: "your-azure-openai-api-key");
Kernel kernel = builder.Build();

var answer = await kernel.InvokePromptAsync("question?");
Console.WriteLine(answer);
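
If you specifically want the raw HTTP request body rather than the framework's log records, another option (a sketch, not part of the original answer) is to pass the connector an HttpClient wrapped in a logging DelegatingHandler. PayloadLoggingHandler is a hypothetical name, and the usage assumes the AddAzureOpenAIChatCompletion overload that accepts an httpClient parameter.

using Microsoft.SemanticKernel;

// Hypothetical handler that prints each outgoing request before forwarding it.
class PayloadLoggingHandler : DelegatingHandler
{
    public PayloadLoggingHandler() : base(new HttpClientHandler()) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        Console.WriteLine($"{request.Method} {request.RequestUri}");
        if (request.Content is not null)
        {
            // This JSON body is the payload Semantic Kernel sends to Azure OpenAI.
            Console.WriteLine(await request.Content.ReadAsStringAsync(cancellationToken));
        }
        return await base.SendAsync(request, cancellationToken);
    }
}

// Usage sketch: hand the wrapped HttpClient to the connector.
// var httpClient = new HttpClient(new PayloadLoggingHandler());
// builder.AddAzureOpenAIChatCompletion(
//     deploymentName: "your-deployment-name",
//     endpoint: "your-azure-openai-endpoint",
//     apiKey: "your-azure-openai-api-key",
//     httpClient: httpClient);

This prints exactly what goes over the wire, whereas the logger/OpenTelemetry approach above surfaces whatever the library chooses to log at the configured log level.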