Hi uehara ryo,
Which model are you trying to use? Could you please share the code you are using to make the call when interacting with Azure Cognitive Search?
I have tested this with the GPT-3.5 Turbo (0301) model, and it behaves fine when I pass "Hi" to the chat completions endpoint. Here is the Python code I used -
import openai, os, requests
openai.api_type = "azure"
# Azure OpenAI on your own data is only supported by the 2023-08-01-preview API version
openai.api_version = "2023-08-01-preview"
# Azure OpenAI setup
openai.api_base = "https://{openai}.openai.azure.com/" # Add your endpoint here
openai.api_key = os.getenv("AOAIKey") # Add your OpenAI API key here
deployment_id = "gpt35turbo" # Add your deployment ID here
# Azure Cognitive Search setup
search_endpoint = "https://{cognitivesearch}.search.windows.net" # Add your Azure Cognitive Search endpoint here
search_key = "{searchkey}" # Add your Azure Cognitive Search admin key here
search_index_name = "azureblob-index" # Add your Azure Cognitive Search index name here
def setup_byod(deployment_id: str) -> None:
    """Sets up the OpenAI Python SDK to use your own data for the chat endpoint.

    :param deployment_id: The deployment ID for the model to use with your own data.

    To remove this configuration, simply set openai.requestssession to None.
    """
    class BringYourOwnDataAdapter(requests.adapters.HTTPAdapter):
        def send(self, request, **kwargs):
            request.url = f"{openai.api_base}/openai/deployments/{deployment_id}/extensions/chat/completions?api-version={openai.api_version}"
            return super().send(request, **kwargs)

    session = requests.Session()

    # Mount a custom adapter which will use the extensions endpoint for any call using the given `deployment_id`
    session.mount(
        prefix=f"{openai.api_base}/openai/deployments/{deployment_id}",
        adapter=BringYourOwnDataAdapter()
    )

    openai.requestssession = session
setup_byod(deployment_id)
completion = openai.ChatCompletion.create(
    messages=[{"role": "user", "content": "Hi"}],
    deployment_id=deployment_id,
    dataSources=[  # camelCase is intentional, as this is the format the API expects
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": search_endpoint,
                "key": search_key,
                "indexName": search_index_name,
            }
        }
    ]
)
print(completion)
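Rather than printing the whole completion object, you can pull out the assistant's reply and the citations returned by the Azure Cognitive Search extension. Below is a minimal sketch; the `sample_response` dict is made-up example data illustrating the general shape of an extensions chat-completions response (the exact shape can vary by API version), and `extract_answer` is a hypothetical helper, not part of the SDK:

```python
import json

# Made-up sample illustrating the general shape of an "on your data" response;
# the extension returns retrieval results in a "tool" message alongside the answer.
sample_response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "Hello! How can I help you today? [doc1]",
                "context": {
                    "messages": [
                        {
                            "role": "tool",
                            "content": json.dumps({
                                "citations": [
                                    {"title": "intro.pdf", "content": "..."}
                                ]
                            }),
                        }
                    ]
                },
            }
        }
    ]
}

def extract_answer(response: dict):
    """Return (answer_text, citation_titles) from an extensions-style response dict."""
    message = response["choices"][0]["message"]
    answer = message["content"]
    titles = []
    # Citations arrive as a JSON string inside the tool message's content
    for ctx_msg in message.get("context", {}).get("messages", []):
        if ctx_msg.get("role") == "tool":
            payload = json.loads(ctx_msg["content"])
            titles = [c.get("title") for c in payload.get("citations", [])]
    return answer, titles

answer, titles = extract_answer(sample_response)
print(answer)   # assistant reply, with [docN] citation markers
print(titles)   # titles of the cited source documents
```

The `[docN]` markers in the answer text line up with the order of the citations list, which is how the playground renders its clickable references.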
Also, when I ask a question about the document, it gives the answer if the information is available in the indexed content.
I took this code from the Chat Playground, after setting up Azure Blob Storage as the data source with Azure Cognitive Search.
Additionally, if you want to restrict the model to your data only, check "Limit responses to your data content" in the playground. You can achieve the same thing in the API call by setting inScope to true. See the code snippet below -
completion = openai.ChatCompletion.create(
    messages=[{"role": "user", "content": "what is the population of USA?"}],
    deployment_id=deployment_id,
    dataSources=[  # camelCase is intentional, as this is the format the API expects
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": search_endpoint,
                "key": search_key,
                "indexName": search_index_name,
                "inScope": True  # Python boolean; serialized to JSON true
            }
        }
    ]
)
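For reference, the custom adapter in setup_byod only rewrites the request URL; the same call can be made as a raw REST request against the extensions endpoint. The sketch below just builds the URL and payload so you can see what actually goes over the wire - the endpoint, deployment, and key values are placeholders, and no network call is made:

```python
# Sketch of the raw REST request that BringYourOwnDataAdapter produces.
# All endpoint/deployment/key values below are placeholders.
api_base = "https://{openai}.openai.azure.com"   # placeholder resource endpoint
api_version = "2023-08-01-preview"
deployment_id = "gpt35turbo"                     # placeholder deployment name

url = (
    f"{api_base}/openai/deployments/{deployment_id}"
    f"/extensions/chat/completions?api-version={api_version}"
)

payload = {
    "messages": [{"role": "user", "content": "what is the population of USA?"}],
    "dataSources": [
        {
            "type": "AzureCognitiveSearch",
            "parameters": {
                "endpoint": "https://{cognitivesearch}.search.windows.net",
                "key": "{searchkey}",
                "indexName": "azureblob-index",
                "inScope": True,  # restrict answers to your indexed data
            },
        }
    ],
}

headers = {
    "api-key": "{AOAIKey}",          # placeholder Azure OpenAI key
    "Content-Type": "application/json",
}

# To actually send it (requires valid credentials):
# import requests
# resp = requests.post(url, headers=headers, json=payload)
print(url)
```

This is the same URL the adapter substitutes in, which is why mounting it on the deployment prefix is enough to reroute the SDK's normal ChatCompletion call.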
Please refer to the documentation - Using your data (Preview) - for details.
Please let me know if you have any other questions.
Thanks
Saurabh