When you build apps connected to Azure OpenAI, often only a portion of the app interacts with the Azure OpenAI API. When you work on the parts of the app that don't require real replies from the Azure OpenAI API, you can simulate the responses using Dev Proxy. Using simulated responses lets you avoid incurring unnecessary costs. The OpenAIMockResponsePlugin uses a local language model running on Ollama to simulate responses from the Azure OpenAI API.
To simulate Azure OpenAI API responses using Dev Proxy, you need Ollama installed on your machine. To install Ollama, follow the instructions in the Ollama documentation.
By default, Dev Proxy uses the phi-3 language model. To use a different model, update the model property in the Dev Proxy configuration file.
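For example, a minimal sketch of switching the model, assuming you pulled the llama3 model in Ollama and that the model property lives in the languageModel section shown later in this article:
{
  // [...] trimmed for brevity
  "languageModel": {
    "enabled": true,
    // assumption: points Dev Proxy at a different Ollama model than the default phi-3
    "model": "llama3"
  }
}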
Tip
Steps described in this tutorial are available in a ready-to-use Dev Proxy preset. To use the preset, in the command line, run devproxy preset get simulate-azure-openai and follow the instructions.
To simulate Azure OpenAI API responses using Dev Proxy, you need to enable the OpenAIMockResponsePlugin in the devproxyrc.json file.
{
  "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v0.24.0/rc.schema.json",
  "plugins": [
    {
      "name": "OpenAIMockResponsePlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ]
}
Next, configure Dev Proxy to intercept requests to the Azure OpenAI API. For simplicity, use wildcards to intercept requests to all deployments.
{
  // [...] trimmed for brevity
  "urlsToWatch": [
    "https://*.openai.azure.com/openai/deployments/*/completions*"
  ]
}
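If you'd rather intercept traffic for a single Azure OpenAI resource only, you can narrow the pattern. The following sketch assumes a hypothetical resource named contoso and keeps the deployment wildcard:
{
  // [...] trimmed for brevity
  "urlsToWatch": [
    "https://contoso.openai.azure.com/openai/deployments/*/completions*"
  ]
}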
Finally, configure Dev Proxy to use a local language model.
{
  // [...] trimmed for brevity
  "languageModel": {
    "enabled": true
  }
}
The complete configuration file looks like this.
{
  "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v0.24.0/rc.schema.json",
  "plugins": [
    {
      "name": "OpenAIMockResponsePlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://*.openai.azure.com/openai/deployments/*/completions*"
  ],
  "languageModel": {
    "enabled": true
  }
}
Start Ollama with the phi-3 language model. In the command line, run ollama run phi3.
Next, start Dev Proxy. If you use the preset, run devproxy -c "~appFolder/presets/simulate-azure-openai/simulate-azure-openai.json". If you use a custom configuration file named devproxyrc.json stored in the current working directory, run devproxy. Dev Proxy checks that it can access the Ollama language model and confirms that it's ready to simulate Azure OpenAI API responses.
info OpenAIMockResponsePlugin: Checking language model availability...
info Listening on 127.0.0.1:8000...
Hotkeys: issue (w)eb request, (r)ecord, (s)top recording, (c)lear screen
Press CTRL+C to stop Dev Proxy
Run your application and make requests to the Azure OpenAI API. Dev Proxy intercepts the requests and simulates responses using the local language model.
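As a quick test, the following sketch sends a chat completion request through Dev Proxy using Python and the requests library. The resource name, deployment name, API version, and key are placeholders used only for illustration; because Dev Proxy intercepts the request and the plugin answers with the local language model, no real Azure OpenAI resource is needed.
import requests

# Hypothetical resource and deployment names; Dev Proxy answers the request,
# so no real Azure OpenAI resource or key is required.
url = "https://contoso.openai.azure.com/openai/deployments/gpt-4o/chat/completions"

response = requests.post(
    url,
    params={"api-version": "2024-02-01"},  # assumed API version for illustration
    headers={"api-key": "fake-key"},  # placeholder key, never sent to Azure
    json={"messages": [{"role": "user", "content": "Hello!"}]},
    proxies={"https": "http://127.0.0.1:8000"},  # route the request through Dev Proxy
    verify=False,  # skip TLS validation if the Dev Proxy certificate isn't trusted
)
print(response.json())
The request URL falls under the wildcard pattern configured earlier, so the OpenAIMockResponsePlugin returns a simulated response generated by the local model.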
Learn more about the OpenAIMockResponsePlugin.
See also the related Dev Proxy samples.