[Azure OpenAI Studio] How to get citation parts from the OpenAI endpoint with custom data?

Ha Vu, SF-G-2 30 Reputation points
2023-11-22T14:52:13.1266667+00:00

Hello,

I've set up a ChatGPT model with my own data. Via code against the endpoint, can I get the citations shown below the answer, like the chatbot in Azure OpenAI Studio does?

[Screenshot: answer in the Azure OpenAI Studio chat playground with clickable citations listed beneath it]

I am using the code below, which Azure OpenAI Studio suggested. It does return references, but they appear as plain strings such as "...[doc2]", not as clickable links like in Azure OpenAI Studio.

```python
import openai, os, requests

openai.api_type = "azure"
# Azure OpenAI on your own data is only supported by the 2023-08-01-preview API version
openai.api_version = "2023-08-01-preview"

# Azure OpenAI setup
openai.api_base = os.getenv("OPENAI_API_BASE")  # Add your endpoint here
openai.api_key = os.getenv("OPENAI_API_KEY")  # Add your OpenAI API key here
deployment_id = "gpt35-test"  # Add your deployment ID here

# Azure AI Search setup
search_endpoint = os.getenv("SEARCH_ENDPOINT")  # Add your Azure AI Search endpoint here
search_key = os.getenv("SEARCH_KEY")  # Add your Azure AI Search admin key here
search_index_name = os.getenv("SEARCH_INDEX_NAME")  # Add your Azure AI Search index name here

def setup_byod(deployment_id: str) -> None:
    """Sets up the OpenAI Python SDK to use your own data for the chat endpoint.

    :param deployment_id: The deployment ID for the model to use with your own data.

    To remove this configuration, simply set openai.requestssession to None.
    """

    class BringYourOwnDataAdapter(requests.adapters.HTTPAdapter):

        def send(self, request, **kwargs):
            request.url = f"{openai.api_base}/openai/deployments/{deployment_id}/extensions/chat/completions?api-version={openai.api_version}"
            return super().send(request, **kwargs)

    session = requests.Session()

    # Mount a custom adapter which will use the extensions endpoint for any call using the given `deployment_id`
    session.mount(
        prefix=f"{openai.api_base}/openai/deployments/{deployment_id}",
        adapter=BringYourOwnDataAdapter()
    )

    openai.requestssession = session

setup_byod(deployment_id)

message_text = [{"role": "user", "content": "What are the differences between Azure Machine Learning and Azure AI services?"}]

completion = openai.ChatCompletion.create(
    messages=message_text,
    deployment_id=deployment_id,
    dataSources=[  # camelCase is intentional, as this is the format the API expects
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": search_endpoint,
                "key": search_key,
                "indexName": search_index_name,
            }
        }
    ]
)
```
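With the 2023-08-01-preview extensions endpoint, the citation details appear to travel in the response itself: `choices[0]["messages"]` contains a message with role `"tool"` whose content is a JSON string carrying a `citations` array. A minimal sketch of pulling them out, assuming that response shape (the helper name, the field names, and the sample response below are illustrative, not the service's guaranteed schema):

```python
import json

def extract_citations(completion: dict) -> list:
    """Return the citation dicts that the answer's [docN] markers refer to,
    assuming the 2023-08-01-preview extensions response shape."""
    for message in completion["choices"][0].get("messages", []):
        if message.get("role") == "tool":
            # The tool message's content is a JSON string with a "citations" array
            return json.loads(message["content"]).get("citations", [])
    return []

# Illustrative response in the assumed shape, for demonstration only.
sample_completion = {
    "choices": [{
        "messages": [
            {"role": "tool",
             "content": json.dumps({"citations": [
                 {"title": "aml-overview.md",
                  "filepath": "docs/aml-overview.md",
                  "content": "Azure Machine Learning is ..."}]})},
            {"role": "assistant",
             "content": "Azure Machine Learning focuses on ... [doc1]."},
        ]
    }]
}

for i, citation in enumerate(extract_citations(sample_completion), start=1):
    print(f"[doc{i}] {citation['title']} ({citation['filepath']})")
    # → [doc1] aml-overview.md (docs/aml-overview.md)
```

With the citation list in hand, the `[docN]` markers in the assistant's text can be matched back to entries by index.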


But is there a way to retrieve more citation details from the endpoint?


1 answer

  1. Saurabh Sharma 23,791 Reputation points Microsoft Employee
    2023-11-22T23:10:08.04+00:00

    Hi @Ha Vu, SF-G-2

    Welcome to Microsoft Q&A! Thanks for posting the question.

The sample code you are referring to is provided only for reference; it is not the actual Azure OpenAI Studio code. The chat playground receives the citations in the response exactly as you see them in the Python response, but the chat application then parses them in the background and turns each reference into a document link, using code similar to the snippet below, which I took from the sample chat application repo.

[Screenshot: citation-parsing code from the sample chat application repo]

That repo contains the code for a simple chat web app, similar to the Azure OpenAI Studio chat playground, that integrates with Azure OpenAI.
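As a rough sketch of the parsing step described above: replace each `[docN]` marker in the assistant's text with a link built from the matching citation, treating `[doc1]` as a one-based index into the citation list. The field names (`title`, `url`) are assumptions for illustration, not the sample repo's exact schema:

```python
import re

def link_citations(answer: str, citations: list) -> str:
    """Replace [docN] markers with markdown links built from the citations."""
    def replace(match: re.Match) -> str:
        index = int(match.group(1)) - 1  # [doc1] refers to citations[0]
        if 0 <= index < len(citations):
            c = citations[index]
            return f"[{c.get('title', match.group(0))}]({c.get('url', '#')})"
        return match.group(0)  # leave unknown markers untouched
    return re.sub(r"\[doc(\d+)\]", replace, answer)

answer = "Azure ML trains models [doc1], while AI services expose prebuilt APIs [doc2]."
citations = [
    {"title": "aml.md", "url": "https://example.com/aml"},
    {"title": "ai-services.md", "url": "https://example.com/ai"},
]
print(link_citations(answer, citations))
# → Azure ML trains models [aml.md](https://example.com/aml), while AI services
#   expose prebuilt APIs [ai-services.md](https://example.com/ai).
```

A chat UI would render those markdown links as clickable references, which is the effect you see in the playground.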

    Please let me know if you have any other questions.

    Thanks

    Saurabh
