Admin access to prompts sent to my endpoint

Snowder 0 Reputation points

Hello everyone,

I have deployed an Azure OpenAI Service endpoint for GPT-3.5.

For compliance reasons, I would like to know how I can access the prompts sent to my endpoint.

It seems that there is no built-in recording of prompts sent to my endpoint that I could easily access later.

I see two ways to make it work:

  • Deploy a proxy server that I set up: users send their requests to the proxy, which logs the prompts before forwarding them to my Azure OpenAI Service endpoint.
  • Azure OpenAI Service provides a built-in way to record prompts, which can then be inspected manually or programmatically.
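The first option above can be sketched in a few lines. This is a minimal, hypothetical illustration: `LoggingProxy`, its `forward` callable, and the in-memory `log` list are all assumptions for demonstration; a real deployment would forward to the actual Azure OpenAI endpoint and persist the log to durable storage.

```python
import time
from typing import Callable

class LoggingProxy:
    """Minimal sketch of the proxy idea: record every prompt
    before forwarding it to the backend.

    `forward` stands in for the real call to the Azure OpenAI
    Service endpoint; here it is any callable taking the chat
    messages and returning the response dict.
    """

    def __init__(self, forward: Callable[[list], dict]):
        self.forward = forward
        # In production this would be durable storage (a database,
        # blob storage, etc.), not an in-memory list.
        self.log = []

    def chat(self, messages: list) -> dict:
        # Record the prompt with a timestamp, then forward it.
        self.log.append({"ts": time.time(), "messages": messages})
        return self.forward(messages)

# Usage with a stubbed backend standing in for the real endpoint:
proxy = LoggingProxy(
    forward=lambda msgs: {"choices": [{"message": {"content": "ok"}}]}
)
resp = proxy.chat([{"role": "user", "content": "Hello"}])
print(len(proxy.log))  # → 1 (one prompt recorded)
```

The design choice here is that logging happens before the backend call, so prompts are captured even when the downstream request fails.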

Since the second option would require a new feature, do you think it is something that will be implemented any time soon?



Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

1 answer

Sort by: Most helpful
  1. Pramod Valavala 19,356 Reputation points Microsoft Employee

    @Snowder The first option would be the way to go for now. The sample ChatGPT-style web app implements this using CosmosDB, which you could use for reference.

    This is the default for most APIs within Azure, wherein request/response payloads are not logged. This protects user privacy and leaves developers free to integrate logging in whatever way they need.

    While there is no such feature today, feel free to raise a feature request here with your scenario so the team can assess it.
