Admin access to prompts sent to my endpoint

Snowder 0 Reputation points
2023-11-09T18:17:34.25+00:00

Hello everyone,

I have deployed an Azure OpenAI Service endpoint for GPT-3.5.

For compliance reasons, I would like to know how I can get access to the prompts sent to my endpoint.

It seems there is no built-in recording of the prompts sent to my endpoint that I could access later.

I see two ways to make it work:

  • Either we deploy Azure OpenAI internally and users first send their requests to a proxy server that I set up, which logs the prompts before calling my Azure OpenAI Service endpoint.
  • Or Azure OpenAI Service offers a built-in way to record prompts, which can then be inspected manually or programmatically.

As the second option would require a new feature, do you think it is something that will be implemented any time soon?

Best,

Daniel

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

1 answer

  1. Pramod Valavala 20,626 Reputation points Microsoft Employee
    2023-11-09T23:21:18.6+00:00

    @Snowder The first option would be the way to go for now. The sample ChatGPT-style web app implements this pattern using Cosmos DB, and you could use it for reference.
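    To make option one concrete, here is a minimal sketch of what such a logging proxy could look like. It assumes a Python stack (FastAPI, httpx, and the azure-cosmos SDK) and placeholder names such as AOAI_ENDPOINT, COSMOS_URL, and the "prompt-logs" database, none of which come from the thread; the linked sample web app has its own implementation, so treat this only as an illustration of the idea.

```python
# Sketch of a logging proxy in front of an Azure OpenAI endpoint.
# Assumptions: environment variables AOAI_ENDPOINT, AOAI_KEY, COSMOS_URL, COSMOS_KEY,
# and a Cosmos DB container "requests" (partition key /id) in database "prompt-logs".
import os
import uuid
from datetime import datetime, timezone

import httpx
from azure.cosmos import CosmosClient
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

AOAI_ENDPOINT = os.environ["AOAI_ENDPOINT"]  # e.g. https://<resource>.openai.azure.com
AOAI_KEY = os.environ["AOAI_KEY"]
COSMOS_URL = os.environ["COSMOS_URL"]
COSMOS_KEY = os.environ["COSMOS_KEY"]

app = FastAPI()
container = (
    CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
    .get_database_client("prompt-logs")
    .get_container_client("requests")
)

@app.post("/openai/deployments/{deployment}/chat/completions")
async def proxy_chat(deployment: str, request: Request):
    body = await request.json()

    # Log the prompt and some metadata before forwarding, so it is captured
    # even if the upstream call fails. (The sync Cosmos client blocks the event
    # loop; fine for a sketch, not for production traffic.)
    container.upsert_item({
        "id": str(uuid.uuid4()),
        "deployment": deployment,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "messages": body.get("messages"),
    })

    # Forward the request unchanged to the real Azure OpenAI endpoint.
    async with httpx.AsyncClient(timeout=60) as client:
        upstream = await client.post(
            f"{AOAI_ENDPOINT}/openai/deployments/{deployment}/chat/completions",
            params={"api-version": "2023-05-15"},
            headers={"api-key": AOAI_KEY},
            json=body,
        )
    return JSONResponse(status_code=upstream.status_code, content=upstream.json())
```

    Clients then point at the proxy instead of the Azure OpenAI endpoint, and every prompt ends up queryable in Cosmos DB for compliance review.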

    This is the default for most Azure APIs, where request/response payloads are not logged. This is both to protect user privacy and to give developers the freedom to integrate in whatever way they need.

    While there is no such feature today, feel free to raise a feature request here with your scenario so the team can assess it.

