Quickstart: Chat with Azure OpenAI models using your own data
Reference | Source code | Package (pypi) | Samples
The links above reference the OpenAI API for Python. There is no Azure-specific OpenAI Python SDK. Learn how to switch between the OpenAI services and Azure OpenAI services.
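If you're coming from the public OpenAI service, the switch largely comes down to a few configuration values. The following is a minimal sketch using the openai 0.28 Python SDK (the same library version used in the Python example later in this quickstart); the endpoint, key, and deployment name shown are placeholders:

```python
import openai

# Public OpenAI service: only an API key is needed, and models are addressed by name,
# for example openai.ChatCompletion.create(model="gpt-3.5-turbo", ...).

# Azure OpenAI service: point the SDK at your resource and call your model deployment by
# the custom deployment name you chose, for example
# openai.ChatCompletion.create(deployment_id="<your-deployment-name>", ...).
openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com"  # placeholder endpoint
openai.api_version = "2023-08-01-preview"
openai.api_key = "<your-azure-openai-key>"  # placeholder key
```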
In this quickstart, you can use your own data with Azure OpenAI models. Using Azure OpenAI models on your data provides a powerful conversational AI platform that enables faster and more accurate communication.
Prerequisites
An Azure subscription - Create one for free.
Access granted to Azure OpenAI in the desired Azure subscription.
Azure OpenAI requires registration and is currently only available to approved enterprise customers and partners. See Limited access to Azure OpenAI Service for more information. You can apply for access to Azure OpenAI by completing the form at https://aka.ms/oai/access. Open an issue on this repo to contact us if you have an issue.
An Azure OpenAI resource with a chat model deployed (for example, GPT-35-Turbo or GPT-4). For more information about model deployment, see the resource deployment guide.
- Your chat model can use version gpt-35-turbo (0301), gpt-35-turbo-16k, gpt-4, or gpt-4-32k. You can view or change your model version in Azure OpenAI Studio.
Be sure that you are assigned at least the Cognitive Services Contributor role for the Azure OpenAI resource.
Add your data using Azure OpenAI Studio
Navigate to Azure OpenAI Studio and sign in with credentials that have access to your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
Select the Chat playground tile.
On the Assistant setup tile, select Add your data (preview) > + Add a data source.
In the pane that appears, under Select data source, select Upload files. Azure OpenAI needs both a storage resource and a search resource to access and index your data.
Tip
- See the following resources for more information:
  - Data source options
  - Supported file types and formats
- You can connect an existing Azure AI Search index or Azure Cosmos DB for MongoDB vCore as a data source.
- For documents and datasets with long text, we recommend using the available data preparation script.
For Azure OpenAI to access your storage account, you will need to turn on Cross-origin resource sharing (CORS). If CORS isn't already turned on for the Azure Blob storage resource, select Turn on CORS.
Select your Azure AI Search resource, and select the acknowledgment that connecting it will incur usage on your account. Then select Next.
On the Upload files pane, select Browse for a file and select the files you want to upload. Then select Upload files, and then select Next.
On the Data management pane, you can choose whether to enable semantic search or vector search for your index.
Important
- Semantic search and vector search are subject to additional pricing.
- You can use keyword search as the search type for no additional cost.
- To enable vector search, you will need a text-embedding-ada-002 deployment in your Azure OpenAI resource.
- Currently, Azure OpenAI on your data supports semantic search for English data only. Only enable semantic search if both your documents and use case are in English.
Review the details you entered, and select Save and close. You can now chat with the model and it will use information from your data to construct the response.
Chat playground
Start exploring Azure OpenAI capabilities with a no-code approach through the chat playground. It's simply a text box where you can submit a prompt to generate a completion. From this page, you can quickly iterate and experiment with the capabilities.
You can experiment with configuration settings such as temperature and pre-response text to improve the performance of your task. You can read more about each parameter in the REST API reference; an example of setting these parameters through the API follows the list below.
- Selecting the Generate button will send the entered text to the completions API and stream the results back to the text box.
- Select the Undo button to undo the prior generation call.
- Select the Regenerate button to complete an undo and generation call together.
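If you later want to apply similar settings outside the playground, the same parameters map directly onto the API. The following is a minimal sketch using the openai 0.28 Python SDK used later in this quickstart; it assumes the environment variables defined later in this article, and the system message here is only a stand-in for the playground's assistant setup text:

```python
import os
import openai

# Configure the SDK for Azure OpenAI (uses the environment variables defined later in this quickstart).
openai.api_type = "azure"
openai.api_base = os.environ["AOAIEndpoint"]
openai.api_version = "2023-08-01-preview"
openai.api_key = os.environ["AOAIKey"]

completion = openai.ChatCompletion.create(
    deployment_id=os.environ["AOAIDeploymentId"],
    messages=[
        # The system message roughly corresponds to the playground's assistant setup text.
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what Azure OpenAI on your data does."},
    ],
    temperature=0.7,  # higher values give more varied output; lower values are more deterministic
    top_p=0.95,       # nucleus sampling; typically adjust temperature or top_p, not both
    max_tokens=400,   # upper bound on the length of the generated reply
)
print(completion["choices"][0]["message"]["content"])
```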
Deploy your model
Once you're satisfied with the experience in Azure OpenAI Studio, you can deploy a web app directly from the Studio by selecting the Deploy to button.
This gives you the option to deploy either to a standalone web application, or to Power Virtual Agents if you're using your own data on the model.
As an example, if you choose to deploy a web app:
The first time you deploy a web app, you should select Create a new web app. Choose a name for the app, which will become part of the app URL. For example, https://<appname>.azurewebsites.net.
Select your subscription, resource group, location, and pricing plan for the published app. To update an existing app, select Publish to an existing web app and choose the name of your previous app from the dropdown menu.
If you choose to deploy a web app, see the important considerations for using it.
Retrieve required variables
To successfully make a call against Azure OpenAI, you need the following variables. This quickstart assumes you've uploaded your data to an Azure blob storage account and have an Azure AI Search index created. For more information, see Add your data using Azure OpenAI Studio earlier in this article.
| Variable name | Value |
|---|---|
| `AOAIEndpoint` | This value can be found in the Keys & Endpoint section when examining your Azure OpenAI resource from the Azure portal. Alternatively, you can find the value in Azure AI Studio > Chat playground > Code view. An example endpoint is `https://my-resource.openai.azure.com`. |
| `AOAIKey` | This value can be found in the Resource management > Keys & Endpoint section when examining your Azure OpenAI resource from the Azure portal. You can use either `KEY1` or `KEY2`. Always having two keys allows you to securely rotate and regenerate keys without causing a service disruption. |
| `AOAIDeploymentId` | This value corresponds to the custom name you chose for your deployment when you deployed a model. You can find it under Resource Management > Deployments in the Azure portal, or under Management > Deployments in Azure AI Studio. |
| `SearchEndpoint` | This value can be found in the Overview section when examining your Azure AI Search resource from the Azure portal. |
| `SearchKey` | This value can be found in the Settings > Keys section when examining your Azure AI Search resource from the Azure portal. You can use either the primary admin key or the secondary admin key. Always having two keys allows you to securely rotate and regenerate keys without causing a service disruption. |
| `SearchIndex` | This value corresponds to the name of the index you created to store your data. You can find it in the Overview section when examining your Azure AI Search resource from the Azure portal. |
Environment variables
setx AOAIEndpoint REPLACE_WITH_YOUR_AOAI_ENDPOINT_VALUE_HERE
setx AOAIKey REPLACE_WITH_YOUR_AOAI_KEY_VALUE_HERE
setx AOAIDeploymentId REPLACE_WITH_YOUR_AOAI_DEPLOYMENT_VALUE_HERE
setx SearchEndpoint REPLACE_WITH_YOUR_AZURE_SEARCH_RESOURCE_VALUE_HERE
setx SearchKey REPLACE_WITH_YOUR_AZURE_SEARCH_RESOURCE_KEY_VALUE_HERE
setx SearchIndex REPLACE_WITH_YOUR_INDEX_NAME_HERE
Create a new .NET Core application
In a console window (such as cmd, PowerShell, or Bash), use the `dotnet new` command to create a new console app with the name `azure-openai-quickstart`. This command creates a simple "Hello World" project with a single C# source file: Program.cs.
dotnet new console -n azure-openai-quickstart
Change your directory to the newly created app folder. You can build the application with:
dotnet build
The build output should contain no warnings or errors.
...
Build succeeded.
0 Warning(s)
0 Error(s)
...
Install the OpenAI .NET client library with:
dotnet add package Azure.AI.OpenAI --prerelease
From the project directory, open the Program.cs file and replace its contents with the following code:
Without response streaming
using Azure;
using Azure.AI.OpenAI;
using System.Text.Json;
using static System.Environment;
string azureOpenAIEndpoint = GetEnvironmentVariable("AOAIEndpoint");
string azureOpenAIKey = GetEnvironmentVariable("AOAIKey");
string searchEndpoint = GetEnvironmentVariable("SearchEndpoint");
string searchKey = GetEnvironmentVariable("SearchKey");
string searchIndex = GetEnvironmentVariable("SearchIndex");
string deploymentName = GetEnvironmentVariable("AOAIDeploymentId");
var client = new OpenAIClient(new Uri(azureOpenAIEndpoint), new AzureKeyCredential(azureOpenAIKey));
var chatCompletionsOptions = new ChatCompletionsOptions()
{
Messages =
{
new ChatMessage(ChatRole.User, "What are the differences between Azure Machine Learning and Azure AI services?"),
},
AzureExtensionsOptions = new AzureChatExtensionsOptions()
{
Extensions =
{
new AzureCognitiveSearchChatExtensionConfiguration()
{
SearchEndpoint = new Uri(searchEndpoint),
SearchKey = new AzureKeyCredential(searchKey),
IndexName = searchIndex,
},
}
}
};
Response<ChatCompletions> response = client.GetChatCompletions(deploymentName, chatCompletionsOptions);
ChatMessage responseMessage = response.Value.Choices[0].Message;
Console.WriteLine($"Message from {responseMessage.Role}:");
Console.WriteLine("===");
Console.WriteLine(responseMessage.Content);
Console.WriteLine("===");
Console.WriteLine($"Context information (e.g. citations) from chat extensions:");
Console.WriteLine("===");
foreach (ChatMessage contextMessage in responseMessage.AzureExtensionsContext.Messages)
{
string contextContent = contextMessage.Content;
try
{
var contextMessageJson = JsonDocument.Parse(contextMessage.Content);
contextContent = JsonSerializer.Serialize(contextMessageJson, new JsonSerializerOptions()
{
WriteIndented = true,
});
}
catch (JsonException)
{}
Console.WriteLine($"{contextMessage.Role}: {contextContent}");
}
Console.WriteLine("===");
Important
For production, use a secure way of storing and accessing your credentials like Azure Key Vault. For more information about credential security, see the Azure AI services security article.
dotnet run
Output
Message from assistant:
===
Azure Machine Learning is a cloud-based service that provides tools and services to build, train, and deploy machine learning models. It offers a collaborative environment for data scientists, developers, and domain experts to work together on machine learning projects. Azure Machine Learning supports various programming languages, frameworks, and libraries, including Python, R, TensorFlow, and PyTorch [^1^].
===
Context information (e.g. citations) from chat extensions:
===
tool: {
"citations": [
{
"content": "...",
"id": null,
"title": "...",
"filepath": "...",
"url": "...",
"metadata": {
"chunking": "orignal document size=1011. Scores=3.6390076 and None.Org Highlight count=38."
},
"chunk_id": "2"
},
...
],
"intent": "[\u0022What are the differences between Azure Machine Learning and Azure AI services?\u0022]"
}
===
This will wait until the model has generated its entire response before printing the results. Alternatively, if you want to asynchronously stream the response and print the results, you can replace the contents of Program.cs with the code in the next example.
Async with streaming
using Azure;
using Azure.AI.OpenAI;
using System.Text.Json;
using static System.Environment;
string azureOpenAIEndpoint = GetEnvironmentVariable("AOAIEndpoint");
string azureOpenAIKey = GetEnvironmentVariable("AOAIKey");
string searchEndpoint = GetEnvironmentVariable("SearchEndpoint");
string searchKey = GetEnvironmentVariable("SearchKey");
string searchIndex = GetEnvironmentVariable("SearchIndex");
string deploymentName = GetEnvironmentVariable("AOAIDeploymentId");
var client = new OpenAIClient(new Uri(azureOpenAIEndpoint), new AzureKeyCredential(azureOpenAIKey));
var chatCompletionsOptions = new ChatCompletionsOptions()
{
Messages =
{
new ChatMessage(ChatRole.User, "What are the differences between Azure Machine Learning and Azure AI services?"),
},
AzureExtensionsOptions = new AzureChatExtensionsOptions()
{
Extensions =
{
new AzureCognitiveSearchChatExtensionConfiguration()
{
SearchEndpoint = new Uri(searchEndpoint),
SearchKey = new AzureKeyCredential(searchKey),
IndexName = searchIndex,
},
}
}
};
Response<StreamingChatCompletions> response = await client.GetChatCompletionsStreamingAsync(
deploymentName,
chatCompletionsOptions);
using StreamingChatCompletions streamingChatCompletions = response.Value;
await foreach (StreamingChatChoice streamingChatChoice in streamingChatCompletions.GetChoicesStreaming())
{
await foreach (ChatMessage chatMessage in streamingChatChoice.GetMessageStreaming())
{
if (chatMessage.Role != default)
{
Console.WriteLine($"Message from {chatMessage.Role}: ");
}
if (chatMessage.Content != default)
{
Console.Write(chatMessage.Content);
}
if (chatMessage.AzureExtensionsContext != default)
{
Console.WriteLine($"Context information (e.g. citations) from chat extensions:");
foreach (var contextMessage in chatMessage.AzureExtensionsContext.Messages)
{
string contextContent = contextMessage.Content;
try
{
var contextMessageJson = JsonDocument.Parse(contextMessage.Content);
contextContent = JsonSerializer.Serialize(contextMessageJson, new JsonSerializerOptions()
{
WriteIndented = true,
});
}
catch (JsonException)
{}
Console.WriteLine($"{contextMessage.Role}: {contextContent}");
}
}
}
}
Retrieve required variables
The required variables and environment variable settings are the same as those listed in the Retrieve required variables section earlier in this quickstart.
Create a Node application
In a console window (such as cmd, PowerShell, or Bash), create a new directory for your app, and navigate to it. Then run the `npm init` command to create a node application with a package.json file.
npm init
Install the client library
Install the Azure OpenAI client and Azure Identity libraries for JavaScript with npm:
npm install @azure/openai @azure/identity
Your app's package.json file will be updated with the dependencies.
Create a sample application
Open a command prompt where you want the new project, and create a new file named ChatWithOwnData.js. Copy the following code into the ChatWithOwnData.js file.
const { OpenAIClient, AzureKeyCredential } = require("@azure/openai");
const { DefaultAzureCredential } = require("@azure/identity");
// Set the Azure and Cognitive Search values from environment variables
const endpoint = process.env["AOAIEndpoint"];
const azureApiKey = process.env["AOAIKey"];
const searchEndpoint = process.env["SearchEndpoint"];
const searchKey = process.env["SearchKey"];
const searchIndex = process.env["SearchIndex"];
const deploymentId = process.env["AOAIDeploymentId"];
async function main() {
console.log("== Chat Using Your Own Data Sample ==");
const client = new OpenAIClient(endpoint, new AzureKeyCredential(azureApiKey));
const messages = [
{ role: "user", content: "What are the differences between Azure Machine Learning and Azure AI services?" },
];
// Get chat responses from Azure OpenAI deployment using your own data via Azure AI Search
const events = client.listChatCompletions(deploymentId, messages, {
azureExtensionOptions: {
extensions: [
{
type: "AzureCognitiveSearch",
parameters: {
endpoint: searchEndpoint,
key: searchKey,
indexName: searchIndex,
},
},
],
},
});
// Display chat responses
for await (const event of events) {
for (const choice of event.choices) {
const delta = choice.delta?.content;
const role = choice.delta?.role;
if (delta && role) {
console.log(`${role}: ${delta}`);
const contextMessages = choice.delta?.context?.messages;
if (!!contextMessages) {
console.log("===");
console.log("Context information (e.g. citations) from chat extensions:");
console.log("===");
for (const message of contextMessages) {
// Display context included with chat responses (such as citations)
console.log(message.content);
}
}
}
}
}
}
main().catch((err) => {
console.error("The sample encountered an error:", err);
});
module.exports = { main };
Important
For production, use a secure way of storing and accessing your credentials like Azure Key Vault. For more information about credential security, see the Azure AI services security article.
node ChatWithOwnData.js
Output
== Chat Using Your Own Data Sample ==
assistant: Azure Machine Learning is a cloud-based service that provides tools and services to build, train, and deploy machine learning models. It offers a collaborative environment for data scientists, developers, and domain experts to work together on machine learning projects. Azure Machine Learning supports various programming languages, frameworks, and libraries, including Python, R, TensorFlow, and PyTorch [^1^].
===
Context information (e.g. citations) from chat extensions:
===
tool: {
'citations': [
{
'content': '...',
'id': null,
'title': '...',
'filepath': '...',
'url': '...',
'metadata': {
"chunking': 'orignal document size=1011. Scores=3.6390076 and None.Org Highlight count=38.'
},
'chunk_id': '2'
},
...
],
'intent': '[\u0022What are the differences between Azure Machine Learning and Azure AI services?\u0022]'
}
Retrieve required variables
The required variables and environment variable settings are the same as those listed in the Retrieve required variables section earlier in this quickstart.
Create a Python environment
- Create a new folder named openai-python for your project and a new Python code file named main.py. Change into that directory:
mkdir openai-python
cd openai-python
- Install the following Python libraries:
pip install openai==0.28.1
pip install python-dotenv
Create the Python app
- From the project directory, open the main.py file and add the following code:
import os
import openai
import dotenv
import requests
dotenv.load_dotenv()
openai.api_base = os.environ.get("AOAIEndpoint")
# Azure OpenAI on your own data is only supported by the 2023-08-01-preview API version
openai.api_version = "2023-08-01-preview"
openai.api_type = 'azure'
openai.api_key = os.environ.get("AOAIKey")
def setup_byod(deployment_id: str) -> None:
    """Sets up the OpenAI Python SDK to use your own data for the chat endpoint.

    :param deployment_id: The deployment ID for the model to use with your own data.

    To remove this configuration, simply set openai.requestssession to None.
    """

    class BringYourOwnDataAdapter(requests.adapters.HTTPAdapter):

        def send(self, request, **kwargs):
            request.url = f"{openai.api_base}/openai/deployments/{deployment_id}/extensions/chat/completions?api-version={openai.api_version}"
            return super().send(request, **kwargs)

    session = requests.Session()

    # Mount a custom adapter which will use the extensions endpoint for any call using the given `deployment_id`
    session.mount(
        prefix=f"{openai.api_base}/openai/deployments/{deployment_id}",
        adapter=BringYourOwnDataAdapter()
    )

    openai.requestssession = session
aoai_deployment_id = os.environ.get("AOAIDeploymentId")
setup_byod(aoai_deployment_id)
completion = openai.ChatCompletion.create(
messages=[{"role": "user", "content": "What are the differences between Azure Machine Learning and Azure AI services?"}],
deployment_id=os.environ.get("AOAIDeploymentId"),
dataSources=[ # camelCase is intentional, as this is the format the API expects
{
"type": "AzureCognitiveSearch",
"parameters": {
"endpoint": os.environ.get("SearchEndpoint"),
"key": os.environ.get("SearchKey"),
"indexName": os.environ.get("SearchIndex"),
}
}
]
)
print(completion)
Important
For production, use a secure way of storing and accessing your credentials like Azure Key Vault. For more information about credential security, see the Azure AI services security article.
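As one option, here's a minimal sketch of reading the key from Azure Key Vault with the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are placeholders, not values created elsewhere in this quickstart:

```python
import openai
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholders: substitute your own Key Vault name and the secret name you stored the key under.
vault_url = "https://<your-key-vault-name>.vault.azure.net"
secret_client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

# Read the Azure OpenAI key from Key Vault instead of an environment variable.
openai.api_key = secret_client.get_secret("AOAIKey").value
```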
- Execute the following command:
python main.py
The application prints the response in a JSON format suitable for use in many scenarios. It includes both answers to your query and citations from your uploaded files.
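If you want to work with those fields programmatically instead of printing the raw JSON, the sketch below shows one way to separate the answer from the citations. It assumes the response shape shown in the cURL example output later in this article, where choices[0] contains a tool message (citations) followed by an assistant message:

```python
import json

def print_answer_and_citations(completion) -> None:
    """Print the assistant reply and any citations from an Azure OpenAI on your data response."""
    for message in completion["choices"][0]["messages"]:
        if message["role"] == "assistant":
            print("Answer:", message["content"])
        elif message["role"] == "tool":
            # The tool message carries the retrieval context, including citations, as a JSON string.
            context = json.loads(message["content"])
            for citation in context.get("citations", []):
                print("Citation:", citation.get("title"), citation.get("url"))

# Example usage in main.py, after the openai.ChatCompletion.create(...) call:
# print_answer_and_citations(completion)
```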
Retrieve required variables
The required variables and environment variable settings are the same as those listed in the Retrieve required variables section earlier in this quickstart.
Example PowerShell commands
The Azure OpenAI chat models are optimized to work with inputs formatted as a conversation. The `messages` variable passes an array of dictionaries with different roles in the conversation delineated by system, user, tool, and assistant. The `dataSources` variable connects to your Azure AI Search index, and enables Azure OpenAI models to respond using your data.
To trigger a response from the model, you should end with a user message indicating that it's the assistant's turn to respond.
Tip
There are several parameters you can use to change the model's response, such as `temperature` or `top_p`. See the reference documentation for more information.
# Azure OpenAI metadata variables
$openai = @{
api_key = $Env:AZURE_OPENAI_KEY
api_base = $Env:AZURE_OPENAI_ENDPOINT # your endpoint should look like the following https://YOUR_RESOURCE_NAME.openai.azure.com/
api_version = '2023-07-01-preview' # this may change in the future
name = 'YOUR-DEPLOYMENT-NAME-HERE' #This will correspond to the custom name you chose for your deployment when you deployed a model.
}
$acs = @{
search_endpoint = 'YOUR ACS ENDPOINT' # your endpoint should look like the following https://YOUR_RESOURCE_NAME.search.windows.net/
search_key = 'YOUR-ACS-KEY-HERE' # or use the Get-Secret cmdlet to retrieve the value
search_index = 'YOUR-INDEX-NAME-HERE' # the name of your ACS index
}
# Completion text
$body = @{
dataSources = @(
@{
type = 'AzureCognitiveSearch'
parameters = @{
endpoint = $acs.search_endpoint
key = $acs.search_key
indexName = $acs.search_index
}
}
)
messages = @(
@{
role = 'user'
content = 'How do you query REST using PowerShell'
}
)
} | convertto-json -depth 5
# Header for authentication
$headers = [ordered]@{
'api-key' = $openai.api_key
}
# Send a completion call to generate an answer
$url = "$($openai.api_base)/openai/deployments/$($openai.name)/extensions/chat/completions?api-version=$($openai.api_version)"
$response = Invoke-RestMethod -Uri $url -Headers $headers -Body $body -Method Post -ContentType 'application/json'
return $response.choices.messages[1].content
Example output
To query a RESTful web service using PowerShell, you can use the `Invoke-RestMethod` cmdlet. This cmdlet sends HTTP and HTTPS requests to RESTful web services and processes the response based on the data type.
Important
For production, use a secure way of storing and accessing your credentials, such as the PowerShell Secret Management module with Azure Key Vault. For more information about credential security, see the Azure AI services security article.
Chat with your model using a web app
To start chatting with the Azure OpenAI model that uses your data, you can deploy a web app using Azure OpenAI Studio or the example code we provide on GitHub. The app deploys by using Azure App Service, and provides a user interface for sending queries. It can be used with Azure OpenAI models that use your data, or models that don't use your data. See the readme file in the repo for instructions on requirements, setup, and deployment. You can optionally customize the frontend and backend logic of the web app by making changes to the source code.
Retrieve required variables
The required variables and environment variable settings are the same as those listed in the Retrieve required variables section earlier in this quickstart.
Create a Go environment
Create a new folder named openai-go for your project and a new Go code file named sample.go. Change into that directory:
mkdir openai-go
cd openai-go
Enable dependency tracking for your code:
go mod init example/azure-openai
Then install the Azure OpenAI Go package:
go get github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai
Create the Go app
From the project directory, open the sample.go file and add the following code:
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/Azure/azure-sdk-for-go/sdk/ai/azopenai"
	"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
)

func main() {
	azureOpenAIKey := os.Getenv("AOAIKey")
	modelDeploymentID := os.Getenv("AOAIDeploymentId")

	// Ex: "https://<your-azure-openai-host>.openai.azure.com"
	azureOpenAIEndpoint := os.Getenv("AOAIEndpoint")

	// Azure AI Search configuration
	searchIndex := os.Getenv("SearchIndex")
	searchEndpoint := os.Getenv("SearchEndpoint")
	searchAPIKey := os.Getenv("SearchKey")

	if azureOpenAIKey == "" || modelDeploymentID == "" || azureOpenAIEndpoint == "" || searchIndex == "" || searchEndpoint == "" || searchAPIKey == "" {
		fmt.Fprintf(os.Stderr, "Skipping example, environment variables missing\n")
		return
	}

	keyCredential, err := azopenai.NewKeyCredential(azureOpenAIKey)

	if err != nil {
		// TODO: Update the following line with your application specific error handling logic
		log.Fatalf("ERROR: %s", err)
	}

	// In Azure OpenAI you must deploy a model before you can use it in your client. For more information
	// see here: https://learn.microsoft.com/azure/cognitive-services/openai/how-to/create-resource
	client, err := azopenai.NewClientWithKeyCredential(azureOpenAIEndpoint, keyCredential, nil)

	if err != nil {
		// TODO: Update the following line with your application specific error handling logic
		log.Fatalf("ERROR: %s", err)
	}

	resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
		Messages: []azopenai.ChatMessage{
			{Content: to.Ptr("What are the differences between Azure Machine Learning and Azure AI services?"), Role: to.Ptr(azopenai.ChatRoleUser)},
		},
		MaxTokens: to.Ptr[int32](512),
		AzureExtensionsOptions: &azopenai.AzureChatExtensionOptions{
			Extensions: []azopenai.AzureChatExtensionConfiguration{
				{
					// This allows Azure OpenAI to use an Azure AI Search index.
					//
					// > Because the model has access to, and can reference specific sources to support its responses, answers are not only based on its pretrained knowledge
					// > but also on the latest information available in the designated data source. This grounding data also helps the model avoid generating responses
					// > based on outdated or incorrect information.
					//
					// Quote from here: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data
					Type: to.Ptr(azopenai.AzureChatExtensionTypeAzureCognitiveSearch),
					Parameters: azopenai.AzureCognitiveSearchChatExtensionConfiguration{
						Endpoint:  &searchEndpoint,
						IndexName: &searchIndex,
						Key:       &searchAPIKey,
					},
				},
			},
		},
		Deployment: modelDeploymentID,
	}, nil)

	if err != nil {
		// TODO: Update the following line with your application specific error handling logic
		log.Fatalf("ERROR: %s", err)
	}

	// Contains contextual information from your Azure chat completion extensions, configured above in `AzureExtensionsOptions`
	msgContext := resp.Choices[0].Message.Context

	fmt.Fprintf(os.Stderr, "Extensions Context Role: %s\nExtensions Context (length): %d\n",
		*msgContext.Messages[0].Role,
		len(*msgContext.Messages[0].Content))

	fmt.Fprintf(os.Stderr, "ChatRole: %s\nChat content: %s\n",
		*resp.Choices[0].Message.Role,
		*resp.Choices[0].Message.Content,
	)
}
Important
For production, use a secure way of storing and accessing your credentials like Azure Key Vault. For more information about credential security, see the Azure AI services security article.
Execute the following command:
go run sample.go
The application prints the response including both answers to your query and citations from your uploaded files.
Retrieve required variables
The required variables and environment variable settings are the same as those listed in the Retrieve required variables section earlier in this quickstart.
Example cURL commands
The Azure OpenAI chat models are optimized to work with inputs formatted as a conversation. The `messages` variable passes an array of dictionaries with different roles in the conversation delineated by system, user, tool, and assistant. The `dataSources` variable connects to your Azure AI Search index, and enables Azure OpenAI models to respond using your data.
To trigger a response from the model, you should end with a user message indicating that it's the assistant's turn to respond.
Tip
There are several parameters you can use to change the model's response, such as `temperature` or `top_p`. See the reference documentation for more information.
curl -i -X POST $AOAIEndpoint/openai/deployments/$AOAIDeploymentId/extensions/chat/completions?api-version=2023-06-01-preview \
-H "Content-Type: application/json" \
-H "api-key: $AOAIKey" \
-d \
'
{
"dataSources": [
{
"type": "AzureCognitiveSearch",
"parameters": {
"endpoint": "'$SearchEndpoint'",
"key": "'$SearchKey'",
"indexName": "'$SearchIndex'"
}
}
],
"messages": [
{
"role": "user",
"content": "What are the differences between Azure Machine Learning and Azure AI services?"
}
]
}
'
Example output
{
"id": "12345678-1a2b-3c4e5f-a123-12345678abcd",
"model": "",
"created": 1684304924,
"object": "chat.completion",
"choices": [
{
"index": 0,
"messages": [
{
"role": "tool",
"content": "{\"citations\": [{\"content\": \"\\nAzure AI services are cloud-based artificial intelligence (AI) services...\", \"id\": null, \"title\": \"What is Azure AI services\", \"filepath\": null, \"url\": null, \"metadata\": {\"chunking\": \"orignal document size=250. Scores=0.4314117431640625 and 1.72564697265625.Org Highlight count=4.\"}, \"chunk_id\": \"0\"}], \"intent\": \"[\\\"Learn about Azure AI services.\\\"]\"}",
"end_turn": false
},
{
"role": "assistant",
"content": " \nAzure AI services are cloud-based artificial intelligence (AI) services that help developers build cognitive intelligence into applications without having direct AI or data science skills or knowledge. [doc1]. Azure Machine Learning is a cloud service for accelerating and managing the machine learning project lifecycle. [doc1].",
"end_turn": true
}
]
}
]
}
Clean up resources
If you want to clean up and remove an OpenAI or Azure AI Search resource, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it.
Next steps
- Learn more about using your data in Azure OpenAI Service
- Chat app sample code on GitHub.