Deploy an application that uses OpenAI on Azure App Service
You can use Azure App Service to work with popular AI frameworks like LangChain and Semantic Kernel connected to OpenAI for creating intelligent apps. In the following tutorial, we're adding an Azure OpenAI service using Semantic Kernel to a .NET 8 Blazor web application.
Prerequisites
- An Azure OpenAI resource or an OpenAI account.
- A .NET 8 Blazor Web App. Create the application with a template here.
Set up the Blazor web app
For this Blazor web application, we're building off the Blazor template and creating a new Razor page that can send requests to and receive responses from an Azure OpenAI or OpenAI service using Semantic Kernel.
- Right-click the Pages folder under the Components folder and add a new item named OpenAI.razor.
- Add the following code to the OpenAI.razor file and select Save:
@page "/openai"
@rendermode InteractiveServer
<PageTitle>OpenAI</PageTitle>
<h3>OpenAI Query</h3>
<input placeholder="Input query" @bind="newQuery" />
<button class="btn btn-primary" @onclick="SemanticKernelClient">Send Request</button>
<br />
<h4>Server response:</h4> <p>@serverResponse</p>
@code {
public string? newQuery;
public string? serverResponse;
}
Next, we need to add the new page to the navigation so we can navigate to the service.
- Go to the NavMenu.razor file under the Layout folder and add the following div inside the nav element. Then select Save:
<div class="nav-item px-3">
<NavLink class="nav-link" href="openai">
<span class="bi bi-list-nested-nav-menu" aria-hidden="true"></span> OpenAI
</NavLink>
</div>
After the navigation is updated, we can start building the OpenAI client to handle our requests.
API keys and endpoints
To make calls to OpenAI with your client, you first need to get the key and endpoint values from Azure OpenAI or OpenAI and add them as secrets for use in your application. Retrieve and save the values for later use.
For Azure OpenAI, see this documentation to retrieve the key and endpoint values. If you're planning to use managed identity to secure your app, you only need the deploymentName and endpoint values. Otherwise, you need each of the following:
- deploymentName
- endpoint
- apiKey
- modelId
For OpenAI, see this documentation to retrieve the API keys. For our application, you need the following values:
- apiKey
- modelId
Since we're deploying to App Service, we can secure these secrets in Azure Key Vault. Follow the Quickstart to set up your Key Vault and add the secrets you saved earlier. Next, we can use Key Vault references as app settings in our App Service resource so the application can read them. Follow the instructions in the documentation to grant your app access to your Key Vault and to set up Key Vault references. Then, go to the Environment variables blade of your resource in the portal and add the following app settings:
For Azure OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| DEPLOYMENT_NAME | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| ENDPOINT | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| MODEL_ID | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
For OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| OPENAI_API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| OPENAI_MODEL_ID | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
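If you prefer to script this setup, the secret and the Key Vault reference app setting can also be created with the Azure CLI. The following is a minimal sketch; the vault, secret, resource group, and app names are placeholders you'd replace with your own:
# Store the OpenAI key as a Key Vault secret (placeholder names)
az keyvault secret set --vault-name myvault --name mysecret --value "<your-api-key>"
# Point an App Service app setting at that secret with a Key Vault reference
az webapp config appsettings set --resource-group <resource-group> --name <app-name> --settings API_KEY="@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/)"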
Once your app settings are saved, you can bring them into the code by injecting IConfiguration and referencing the app settings. Add the following code to your OpenAI.razor file:
For Azure OpenAI:
@inject Microsoft.Extensions.Configuration.IConfiguration _config
@code {
private async Task SemanticKernelClient()
{
string deploymentName = _config["DEPLOYMENT_NAME"];
string endpoint = _config["ENDPOINT"];
string apiKey = _config["API_KEY"];
string modelId = _config["MODEL_ID"];
}
}
For OpenAI:
@inject Microsoft.Extensions.Configuration.IConfiguration _config
@code {
private async Task SemanticKernelClient()
{
// OpenAI
string OpenAIModelId = _config["OPENAI_MODEL_ID"];
string OpenAIApiKey = _config["OPENAI_API_KEY"];
}
}
Semantic Kernel
Semantic Kernel is an open-source SDK that enables you to easily develop AI agents to work with your existing code. You can use Semantic Kernel with Azure OpenAI and OpenAI models.
To create the OpenAI client, we'll start by installing Semantic Kernel.
To install Semantic Kernel, browse the NuGet package manager in Visual Studio and install the Microsoft.SemanticKernel package. For NuGet Package Manager instructions, see here. For CLI instructions, see here. Once the Semantic Kernel package is installed, you can now initialize the kernel.
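If you'd rather use the command line, the same package can be added from the project directory with the .NET CLI:
dotnet add package Microsoft.SemanticKernel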
Initialize the kernel
To initialize the Kernel, add the following code to the OpenAI.razor file.
@using Microsoft.SemanticKernel
@code {
private async Task SemanticKernelClient()
{
var builder = Kernel.CreateBuilder();
var kernel = builder.Build();
}
}
Here we're adding the using statement and creating the Kernel in a method that we can use when we send the request to the service.
Add your AI service
Once the Kernel is initialized, we can add our chosen AI service to the kernel. Here we define our model and pass in our key and endpoint information to be consumed by the chosen model. If you plan to use managed identity with Azure OpenAI, add the service using the example in the next section.
For Azure OpenAI, use the following code:
var builder = Kernel.CreateBuilder();
builder.Services.AddAzureOpenAIChatCompletion(
deploymentName: deploymentName,
endpoint: endpoint,
apiKey: apiKey,
modelId: modelId
);
var kernel = builder.Build();
For OpenAI, use the following code:
var builder = Kernel.CreateBuilder();
builder.Services.AddOpenAIChatCompletion(
modelId: OpenAIModelId,
apiKey: OpenAIApiKey
);
var kernel = builder.Build();
Secure your app with managed identity
If you're using Azure OpenAI, it's highly recommended to secure your application using managed identity to authenticate your app to your Azure OpenAI resource. This enables your application to access the Azure OpenAI resource without needing to manage API keys. If you're not using Azure OpenAI, your secrets can remain secured by using the Azure Key Vault approach outlined above.
Follow the steps below to secure your application with managed identity:
Add the identity package Azure.Identity. This package enables using Azure credentials in your app. Install the package using the NuGet package manager and add the using statement to the top of the OpenAI.razor file.
@using Azure.Identity
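If you prefer the command line, the package can also be installed from the project directory with the .NET CLI:
dotnet add package Azure.Identity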
Next, include the default Azure credentials in the chat completion parameters. The deploymentName and endpoint parameters are still required and should be secured using the Key Vault method covered in the previous section.
var kernel = Kernel.CreateBuilder()
.AddAzureOpenAIChatCompletion(
deploymentName: deploymentName,
endpoint: endpoint,
credentials: new DefaultAzureCredential()
)
.Build();
Once the credentials are added to the application, you'll then need to enable managed identity in your application and grant access to the resource.
- In your web app resource, navigate to the Identity blade and turn on System assigned and click Save
- Once System assigned identity is turned on, it registers the web app with Microsoft Entra ID and the web app can be granted permissions to access protected resources.
- Go to your Azure OpenAI resource and navigate to the Access control (IAM) blade on the left pane.
- Find the Grant access to this resource card and click on Add role assignment
- Search for the Cognitive Services OpenAI User role and click Next
- On the Members tab, find Assign access to and choose the Managed identity option
- Next, click on +Select Members and find your web app
- Click Review + assign
Your web app is now added as a cognitive service OpenAI user and can communicate to your Azure OpenAI resource.
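If you'd rather script the role assignment than use the portal, the Azure CLI equivalent is roughly the following sketch; the app, resource group, and Azure OpenAI resource names are placeholders:
# Get the web app's system-assigned identity (principal ID)
principalId=$(az webapp identity show --name <app-name> --resource-group <resource-group> --query principalId -o tsv)
# Get the Azure OpenAI resource ID
openaiId=$(az cognitiveservices account show --name <openai-resource-name> --resource-group <resource-group> --query id -o tsv)
# Grant the identity the Cognitive Services OpenAI User role on that resource
az role assignment create --assignee $principalId --role "Cognitive Services OpenAI User" --scope $openaiId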
Configure prompt and create semantic function
Now that our chosen OpenAI service client is created with the correct keys, we can add a function to handle the prompt. With Semantic Kernel, you handle prompts by using a semantic function, which turns the prompt and its configuration settings into a function the Kernel can execute. Learn more about configuring prompts here.
First, we create a variable that holds the user's prompt. Then we add a function with execution settings to handle and configure the prompt. Add the @using directive to the top of the OpenAI.razor file and update the SemanticKernelClient method as follows:
@using Microsoft.SemanticKernel.Connectors.OpenAI
private async Task SemanticKernelClient()
{
var builder = Kernel.CreateBuilder();
builder.Services.AddAzureOpenAIChatCompletion(
deploymentName: deploymentName,
endpoint: endpoint,
apiKey: apiKey,
modelId: modelId
);
var kernel = builder.Build();
var prompt = @"{{$input}} " + newQuery;
var summarize = kernel.CreateFunctionFromPrompt(prompt, executionSettings: new OpenAIPromptExecutionSettings { MaxTokens = 100, Temperature = 0.2 });
}
Lastly, we need to invoke the function and return the response. Add the following to the OpenAI.razor file:
private async Task SemanticKernelClient()
{
var builder = Kernel.CreateBuilder();
builder.Services.AddAzureOpenAIChatCompletion(
deploymentName: deploymentName,
endpoint: endpoint,
apiKey: apiKey,
modelId: modelId
);
var kernel = builder.Build();
var prompt = @"{{$input}} " + newQuery;
var summarize = kernel.CreateFunctionFromPrompt(prompt, executionSettings: new OpenAIPromptExecutionSettings { MaxTokens = 100, Temperature = 0.2 });
var result = await kernel.InvokeAsync(summarize);
serverResponse = result.ToString();
}
Here's the example in its completed form. In this example, use the Azure OpenAI chat completion service OR the OpenAI chat completion service, not both.
@page "/openai"
@rendermode InteractiveServer
@using Microsoft.SemanticKernel
@using Microsoft.SemanticKernel.Connectors.OpenAI
@inject Microsoft.Extensions.Configuration.IConfiguration _config
<PageTitle>OpenAI</PageTitle>
<h3>OpenAI input query: </h3>
<input class="col-sm-4" @bind="newQuery" />
<button class="btn btn-primary" @onclick="SemanticKernelClient">Send Request</button>
<br />
<br />
<h4>Server response:</h4> <p>@serverResponse</p>
@code {
private string? newQuery;
private string? serverResponse;
private async Task SemanticKernelClient()
{
// Azure OpenAI
string deploymentName = _config["DEPLOYMENT_NAME"];
string endpoint = _config["ENDPOINT"];
string apiKey = _config["API_KEY"];
string modelId = _config["MODEL_ID"];
// OpenAI
// string OpenAIModelId = _config["OPENAI_MODEL_ID"];
// string OpenAIApiKey = _config["OPENAI_API_KEY"];
// Semantic Kernel client
var builder = Kernel.CreateBuilder();
// Azure OpenAI
builder.Services.AddAzureOpenAIChatCompletion(
deploymentName: deploymentName,
endpoint: endpoint,
apiKey: apiKey,
modelId: modelId
);
// OpenAI
// builder.Services.AddOpenAIChatCompletion(
// modelId: OpenAIModelId,
// apiKey: OpenAIApiKey
// );
var kernel = builder.Build();
var prompt = @"{{$input}} " + newQuery;
var summarize = kernel.CreateFunctionFromPrompt(prompt, executionSettings: new OpenAIPromptExecutionSettings { MaxTokens = 100, Temperature = 0.2 });
var result = await kernel.InvokeAsync(summarize);
serverResponse = result.ToString();
}
}
Now save the application and follow the next steps to deploy it to App Service. If you'd like to test it locally first, you can swap out the config values with the literal string values of your OpenAI service. For example: string modelId = "gpt-4-turbo";
Deploy to App Service
If you followed the steps above, you're ready to deploy to App Service. If you run into any issues, remember that you need to have granted your app access to your Key Vault and added the app settings with Key Vault references as the values. App Service resolves the app settings in your application that match what you've added in the portal.
Authentication
Although optional, it's highly recommended that you also add authentication to your web app when using an Azure OpenAI or OpenAI service. This adds a layer of security with no additional code. Learn how to enable authentication for your web app here.
Once deployed, browse to the web app and navigate to the OpenAI page. Enter a query, and you should see a populated response from the server. The tutorial is now complete, and you know how to use OpenAI services to create intelligent applications.
You can use Azure App Service to work with popular AI frameworks like LangChain and Semantic Kernel connected to OpenAI for creating intelligent apps. In the following tutorial, we are adding an Azure OpenAI service using LangChain to a Python (Flask) application.
Prerequisites
- An Azure OpenAI resource or an OpenAI account.
- A Flask web application. Create the sample app using our quickstart.
Set up the Flask web app
For this Flask web application, we're building off the quickstart app and updating the app.py file to send requests to and receive responses from an Azure OpenAI or OpenAI service using LangChain.
First, replace the contents of the index.html file with the following code:
<!doctype html>
<html>
<head>
<title>Hello Azure - Python Quickstart</title>
<link rel="stylesheet" href="{{ url_for('static', filename='bootstrap/css/bootstrap.min.css') }}">
<link rel="shortcut icon" href="{{ url_for('static', filename='favicon.ico') }}">
</head>
<body>
<main>
<div class="px-4 py-3 my-2 text-center">
<img class="d-block mx-auto mb-4" src="{{ url_for('static', filename='images/azure-icon.svg') }}" alt="Azure Logo" width="192" height="192"/>
<!-- <img src="/docs/5.1/assets/brand/bootstrap-logo.svg" alt="" width="72" height="57"> -->
<h1 class="display-6 fw-bold text-primary">Welcome to Azure</h1>
</div>
<form method="post" action="{{url_for('hello')}}">
<div class="col-md-6 mx-auto text-center">
<label for="req" class="form-label fw-bold fs-5">Input query below:</label>
<!-- <p class="lead mb-2">Could you please tell me your name?</p> -->
<div class="d-grid gap-2 d-sm-flex justify-content-sm-center align-items-center my-1">
<input type="text" class="form-control" id="req" name="req" style="max-width: 456px;">
</div>
<div class="d-grid gap-2 d-sm-flex justify-content-sm-center my-2">
<button type="submit" class="btn btn-primary btn-lg px-4 gap-3">Submit Request</button>
</div>
</div>
</form>
</main>
</body>
</html>
Next, replace the contents of the hello.html file with the following code:
<!doctype html>
<html>
<head>
<title>Hello Azure - Python Quickstart</title>
<link rel="stylesheet" href="{{ url_for('static', filename='bootstrap/css/bootstrap.min.css') }}">
<link rel="shortcut icon" href="{{ url_for('static', filename='favicon.ico') }}">
</head>
<body>
<main>
<div class="px-4 py-3 my-2 text-center">
<img class="d-block mx-auto mb-4" src="{{ url_for('static', filename='images/azure-icon.svg') }}" alt="Azure Logo" width="192" height="192"/>
<!-- <img src="/docs/5.1/assets/brand/bootstrap-logo.svg" alt="" width="72" height="57"> -->
<h1 class="display-6 fw-bold">OpenAI response:</h1>
<p class="fs-5">
{{req}}
</p>
<a href="{{ url_for('index') }}" class="btn btn-primary btn-lg px-4 gap-3">Back home</a>
</div>
</main>
</body>
</html>
After the files are updated, we can start preparing our environment variables to work with OpenAI.
API Keys and Endpoints
To make calls to OpenAI with your client, you first need to get the key and endpoint values from Azure OpenAI or OpenAI and add them as secrets for use in your application. Retrieve and save the values for later use.
For Azure OpenAI, see this documentation to retrieve the key and endpoint values. If you're planning to use managed identity to secure your app, you only need the api_version and azure_endpoint values. Otherwise, you need each of the following:
- api_key
- api_version
- azure_deployment
- azure_endpoint
- model_name
For OpenAI, see this documentation to retrieve the API keys. For our application, you need the following value:
- api_key
Since we are deploying to App Service, we can secure these secrets in Azure Key Vault for protection. Follow the Quickstart to set up your Key Vault and add the secrets you saved from earlier.
Next, we can use Key Vault references as app settings in our App Service resource to reference in our application. Follow the instructions in the documentation to grant your app access to your Key Vault and to set up Key Vault references.
Then, go to the portal Environment Variables blade in your resource and add the following app settings:
For Azure OpenAI, use the following:
| Setting name | Value |
|---|---|
| API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| API_VERSION | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| AZURE_DEPLOYMENT | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| MODEL_NAME | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
For OpenAI, use the following:
| Setting name | Value |
|---|---|
| OPENAI_API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
Once your app settings are saved, you can access them in your code by referencing the environment variables. Add the following to the app.py file:
For Azure OpenAI:
# Azure OpenAI
api_key = os.environ['API_KEY']
api_version = os.environ['API_VERSION']
azure_deployment = os.environ['AZURE_DEPLOYMENT']
model_name = os.environ['MODEL_NAME']
For OpenAI:
# OpenAI
openai_api_key = os.environ['OPENAI_API_KEY']
LangChain
LangChain is a framework that enables easy development with OpenAI for your applications. You can use LangChain with Azure OpenAI and OpenAI models.
To create the OpenAI client, we'll start by installing the LangChain library.
To install LangChain, navigate to your application using Command Line or PowerShell and run the following pip command:
pip install langchain-openai
Once the package is installed, you can import and use LangChain. Update the app.py file with the following code:
import os
# OpenAI
from langchain_openai import ChatOpenAI
# Azure OpenAI
from langchain_openai import AzureOpenAI
After LangChain is imported into our file, you can add the code that calls OpenAI with the LangChain invoke method. Update app.py to include the following code:
For Azure OpenAI, use the following code. If you plan to use managed identity, you can use the credentials outlined in the following section for the Azure OpenAI parameters.
@app.route('/hello', methods=['POST'])
def hello():
req = request.form.get('req')
llm = AzureOpenAI(
api_key=api_key,
api_version=api_version,
azure_deployment=azure_deployment,
model_name=model_name,
)
text = llm.invoke(req)
For OpenAI, use the following code:
@app.route('/hello', methods=['POST'])
def hello():
req = request.form.get('req')
llm = ChatOpenAI(openai_api_key=openai_api_key)
text = llm.invoke(req)
Here's the example in its completed form. In this example, use the Azure OpenAI chat completion service OR the OpenAI chat completion service, not both.
import os
# Azure OpenAI
from langchain_openai import AzureOpenAI
# OpenAI
from langchain_openai import ChatOpenAI
from flask import (Flask, redirect, render_template, request,
send_from_directory, url_for)
app = Flask(__name__)
@app.route('/')
def index():
print('Request for index page received')
return render_template('index.html')
@app.route('/favicon.ico')
def favicon():
return send_from_directory(os.path.join(app.root_path, 'static'),
'favicon.ico', mimetype='image/vnd.microsoft.icon')
# Azure OpenAI
api_key = os.environ['API_KEY']
api_version = os.environ['API_VERSION']
azure_deployment = os.environ['AZURE_DEPLOYMENT']
model_name = os.environ['MODEL_NAME']
# OpenAI
# openai_api_key = os.environ['OPENAI_API_KEY']
@app.route('/hello', methods=['POST'])
def hello():
req = request.form.get('req')
# Azure OpenAI
llm = AzureOpenAI(
api_key=api_key,
api_version=api_version,
azure_deployment=azure_deployment,
model_name=model_name,
)
text = llm.invoke(req)
# OpenAI
# llm = ChatOpenAI(openai_api_key=openai_api_key)
# text = llm.invoke(req)
if req:
print('Request for hello page received with req=%s' % req)
return render_template('hello.html', req = text)
else:
print('Request for hello page received with no name or blank name -- redirecting')
return redirect(url_for('index'))
if __name__ == '__main__':
app.run()
Now save the application and follow the next steps to deploy it to App Service. If you'd like to test it locally first, you can swap out the key and endpoint values with the literal string values of your OpenAI service. For example: model_name = 'gpt-4-turbo'
Secure your app with managed identity
Although optional, it's highly recommended to secure your application using managed identity to authenticate your app to your Azure OpenAI resource. Skip this step if you are not using Azure OpenAI. This enables your application to access the Azure OpenAI resource without needing to manage API keys.
Follow the steps below to secure your application:
Add the azure-identity package. This package enables using Azure credentials in your app. Install the package and import the default credential and bearer token provider.
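The package can be installed with pip:
pip install azure-identity
Then add the import to app.py: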
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
Next, include the default Azure credentials and token provider in the AzureOpenAI options.
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = AzureOpenAI(
api_version="2024-02-15-preview",
azure_endpoint="https://{your-custom-endpoint}.openai.azure.com/",
azure_ad_token_provider=token_provider
)
Once the credentials are added to the application, you'll then need to enable managed identity in your application and grant access to the resource.
- In your web app resource, navigate to the Identity blade and turn on System assigned and click Save
- Once System assigned identity is turned on, it registers the web app with Microsoft Entra ID and the web app can be granted permissions to access protected resources.
- Go to your Azure OpenAI resource and navigate to the Access control (IAM) blade on the left pane.
- Find the Grant access to this resource card and click on Add role assignment
- Search for the Cognitive Services OpenAI User role and click Next
- On the Members tab, find Assign access to and choose the Managed identity option
- Next, click on +Select Members and find your web app
- Click Review + assign
Your web app is now added as a cognitive service OpenAI user and can communicate to your Azure OpenAI resource.
Deploy to App Service
Before deploying to App Service, you need to edit the requirements.txt file and add an environment variable to your web app so it recognizes the LangChain library and builds properly.
First, add the following package to your requirements.txt file:
langchain-openai
Then, go to the Azure portal and navigate to the Environment variables blade. If you're using Visual Studio to deploy, this app setting enables the same build automation as Git deploy. Add the following app setting to your web app:
SCM_DO_BUILD_DURING_DEPLOYMENT = true
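You can also set this from the Azure CLI; the resource group and app names are placeholders:
az webapp config appsettings set --resource-group <resource-group> --name <app-name> --settings SCM_DO_BUILD_DURING_DEPLOYMENT=true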
If you followed the steps above, you're ready to deploy to App Service as you normally would. If you run into any issues, remember that you need to have granted your app access to your Key Vault and added the app settings with Key Vault references as the values. App Service resolves the app settings in your application that match what you've added in the portal.
Authentication
Although optional, it's highly recommended that you also add authentication to your web app when using an Azure OpenAI or OpenAI service. This adds a layer of security with no additional code. Learn how to enable authentication for your web app here.
Once deployed, browse to the web app, enter a query on the home page, and you should see a populated response from the server. The tutorial is now complete, and you know how to use OpenAI services to create intelligent applications.
You can use Azure App Service to create applications using Azure OpenAI and OpenAI. In the following tutorial, we're adding Azure OpenAI Service to a Java 17 Spring Boot application using the Azure SDK.
Prerequisites
- An Azure OpenAI resource or an OpenAI account.
- A Java Spring Boot application. Create the application using this quickstart.
Set up web app
For this Spring Boot application, we're building off the quickstart app and adding an extra feature to make a request to an Azure OpenAI or OpenAI service. Add the following code to your application:
@RequestMapping("/")
String sayHello() {
String serverResponse = "";
return serverResponse;
}
API Keys and Endpoints
First, you need to get the key and endpoint values from Azure OpenAI or OpenAI and add them as secrets for use in your application. Retrieve and save the values for later use to build the client.
For Azure OpenAI, see this documentation to retrieve the key and endpoint values. If you're planning to use managed identity to secure your app, you only need the endpoint value. Otherwise, you need each of the following:
- endpoint
- apiKey
- deploymentName
For OpenAI, see this documentation to retrieve the API keys. For our application, you need the following values:
- apiKey
- modelName
Since we're deploying to App Service, we can secure these secrets in Azure Key Vault for protection. Follow the Quickstart to set up your Key Vault and add the secrets you saved from earlier.
Next, we can use Key Vault references as app settings in our App Service resource to reference in our application. Follow the instructions in the documentation to grant your app access to your Key Vault and to set up Key Vault references.
Then, go to the portal Environment Variables page in your resource and add the following app settings:
For Azure OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| DEPLOYMENT_NAME | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| ENDPOINT | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
For OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| OPENAI_API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| OPENAI_MODEL_NAME | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
Once your app settings are saved, you can access them in your code by referencing the environment variables. Add the following code in the Application.java file:
For Azure OpenAI:
@RequestMapping("/")
String sayHello() {
// Azure OpenAI
Map<String, String> envVariables = System.getenv();
String apiKey = envVariables.get("API_KEY");
String endpoint = envVariables.get("ENDPOINT");
String deploymentName = envVariables.get("DEPLOYMENT_NAME");
}
For OpenAI:
@RequestMapping("/")
String sayHello() {
// OpenAI
Map<String, String> envVariables = System.getenv();
String apiKey = envVariables.get("OPENAI_API_KEY");
String modelName = envVariables.get("OPENAI_MODEL_NAME");
}
Add the package
Before you can create the client, you first need to add the Azure SDK dependency. Add the following Azure OpenAI package to the pom.xml file and run the mvn package command to build the project.
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-ai-openai</artifactId>
<version>1.0.0-beta.9</version>
</dependency>
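Then rebuild the project from the project root so the dependency is downloaded:
mvn package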
Once the dependency is added and the project builds, we can start working on the client that makes our calls.
Create OpenAI client
Once the package and environment variables are set up, we can create the client that enables chat completion calls.
Add the following code to create the OpenAI client:
For Azure OpenAI:
OpenAIClient client = new OpenAIClientBuilder()
.credential(new AzureKeyCredential(apiKey))
.endpoint(endpoint)
.buildClient();
For OpenAI:
OpenAIClient client = new OpenAIClientBuilder()
.credential(new KeyCredential(apiKey))
.buildClient();
Once added, make sure the following imports are present in the Application.java file:
import com.azure.ai.openai.OpenAIClient;
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.core.credential.AzureKeyCredential;
Secure your app with managed identity
Although optional, it's highly recommended to secure your application using managed identity to authenticate your app to your Azure OpenAI resource. Skip this step if you are not using Azure OpenAI. This enables your application to access the Azure OpenAI resource without needing to manage API keys.
Follow the steps below to secure your application:
The Azure SDK package installed in the previous section enables the use of default credentials in your app. Include the default Azure credentials when you create the client.
TokenCredential defaultCredential = new DefaultAzureCredentialBuilder().build();
OpenAIClient client = new OpenAIClientBuilder()
.credential(defaultCredential)
.endpoint(endpoint)
.buildClient();
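DefaultAzureCredentialBuilder and TokenCredential come from the Azure Identity and Azure Core libraries. If your build can't resolve them, add the azure-identity dependency to pom.xml (the version shown is only an example; use the latest available):
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.11.1</version>
</dependency>
Then add the corresponding imports to Application.java:
import com.azure.core.credential.TokenCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;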
Once the credentials are added to the application, enable managed identity in your application and grant access to the resource:
- In your web app resource, navigate to the Identity blade and turn on System assigned and select Save.
- Once System assigned identity is turned on, it will register the web app with Microsoft Entra ID and the web app can be granted permissions to access protected resources.
- Go to your Azure OpenAI resource and navigate to the Access control (IAM) page on the left pane.
- Find the Grant access to this resource card and select Add role assignment.
- Search for the Cognitive Services OpenAI User role and select Next.
- On the Members tab, find Assign access to and choose the Managed identity option.
- Next, select +Select Members and find your web app.
- Select Review + assign.
Your web app is now added as a cognitive service OpenAI user and can communicate to your Azure OpenAI resource.
Set up prompt and call to OpenAI
Now that our OpenAI client is created, we can use the chat completions method to send our request message to OpenAI and return a response. Here's where we add our chat message prompt to the code to be passed to the chat completions method. Use the following code to set up the chat completions method:
List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestUserMessage("What is Azure App Service in one line tldr?"));
// Azure OpenAI
ChatCompletions chatCompletions = client.getChatCompletions(deploymentName,
new ChatCompletionsOptions(chatMessages));
// OpenAI (use instead of the Azure OpenAI call above)
// ChatCompletions chatCompletions = client.getChatCompletions(modelName,
//     new ChatCompletionsOptions(chatMessages));
Here's the example in its completed form. In this example, use the Azure OpenAI chat completion service OR the OpenAI chat completion service, not both.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import com.azure.ai.openai.OpenAIClient;
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.ai.openai.models.ChatChoice;
import com.azure.ai.openai.models.ChatCompletions;
import com.azure.ai.openai.models.ChatCompletionsOptions;
import com.azure.ai.openai.models.ChatRequestMessage;
import com.azure.ai.openai.models.ChatRequestUserMessage;
import com.azure.ai.openai.models.ChatResponseMessage;
import com.azure.core.credential.AzureKeyCredential;
@SpringBootApplication
@RestController
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@RequestMapping("/")
String sayHello() {
String serverResponse = "";
// Azure OpenAI
Map<String, String> envVariables = System.getenv();
String apiKey = envVariables.get("API_KEY");
String endpoint = envVariables.get("ENDPOINT");
String deploymentName = envVariables.get("DEPLOYMENT_NAME");
// OpenAI
// Map<String, String> envVariables = System.getenv();
// String apiKey = envVariables.get("OPENAI_API_KEY");
// String modelName = envVariables.get("OPENAI_MODEL_NAME");
// Azure OpenAI client
OpenAIClient client = new OpenAIClientBuilder()
.credential(new AzureKeyCredential(apiKey))
.endpoint(endpoint)
.buildClient();
// OpenAI client
// OpenAIClient client = new OpenAIClientBuilder()
// .credential(new KeyCredential(apiKey))
// .buildClient();
// Chat Completion
List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestUserMessage("What's is Azure App Service in one line tldr?"));
// Azure OpenAI
ChatCompletions chatCompletions = client.getChatCompletions(deploymentName,
new ChatCompletionsOptions(chatMessages));
// OpenAI
// ChatCompletions chatCompletions = client.getChatCompletions(modelName,
// new ChatCompletionsOptions(chatMessages));
System.out.printf("Model ID=%s is created at %s.%n", chatCompletions.getId(), chatCompletions.getCreatedAt());
for (ChatChoice choice : chatCompletions.getChoices()) {
ChatResponseMessage message = choice.getMessage();
System.out.printf("Index: %d, Chat Role: %s.%n", choice.getIndex(), message.getRole());
System.out.println("Message:");
System.out.println(message.getContent());
serverResponse = message.getContent();
}
return serverResponse;
}
}
Deploy to App Service
If you completed the steps above, you can deploy to App Service as you normally would. If you run into any issues, remember that you need to complete the following steps: grant your app access to your Key Vault, and add the app settings with key vault references as your values. App Service resolves the app settings in your application that match what you added in the portal.
Once the app is deployed, you can visit your site URL and see the text that contains the response from your chat message prompt.
Authentication
Although optional, it's highly recommended that you also add authentication to your web app when using an Azure OpenAI or OpenAI service. This adds a layer of security with no additional code. Learn how to enable authentication for your web app here.
Once deployed, browse to the web app; the page displays the response to your chat message prompt. The tutorial is now complete, and you know how to use OpenAI services to create intelligent applications.
You can use Azure App Service to create applications using Azure OpenAI and OpenAI. In the following tutorial, we're adding Azure OpenAI Service to an Express application using the Azure SDK.
Prerequisites
- An Azure OpenAI resource or an OpenAI account.
- A Node.js Express application. Create the sample app using our quickstart.
Set up web app
For this application, we're building off the quickstart Express app and adding an extra feature to make a request to an Azure OpenAI or OpenAI service.
First, replace the contents of the index.ejs file with the following code:
<!DOCTYPE html>
<html>
<head>
<title><%= title %></title>
<link rel='stylesheet' href='/stylesheets/style.css' />
</head>
<body>
<h1><%= title %></h1>
<p>Welcome to <%= title %></p>
<form action="/api/completions" method="post">
<label for="prompt"><b>Input query:</b></label>
<input type="text" id="prompt" name="prompt" style="width: 10%">
<input type="submit" value="Submit" id="submitBtn">
</form>
</body>
<script src="./index.js"></script>
</html>
The previous code adds an input box to the index page for submitting requests to OpenAI.
API Keys and Endpoints
First, you need to get the key and endpoint values from Azure OpenAI or OpenAI and add them as secrets for use in your application. Retrieve and save the values for later use to build the client.
For Azure OpenAI, see this documentation to retrieve the key and endpoint values. If you're planning to use managed identity to secure your app, you only need the deploymentName and apiVersion values. Otherwise, you need each of the following:
- endpoint
- apiKey
- deploymentName
- apiVersion
For OpenAI, see this documentation to retrieve the API keys. For our application, you need the following value:
- apiKey
Since we're deploying to App Service, we can secure these secrets in Azure Key Vault for protection. Follow the Quickstart to set up your Key Vault and add the secrets you saved from earlier.
Next, we can use Key Vault references as app settings in our App Service resource to reference in our application. Follow the instructions in the documentation to grant your app access to your Key Vault and to set up Key Vault references.
Then, go to the portal Environment Variables page in your resource and add the following app settings:
For Azure OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| DEPLOYMENT_NAME | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| ENDPOINT | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
| API_VERSION | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
For OpenAI, use the following settings:
| Setting name | Value |
|---|---|
| OPENAI_API_KEY | @Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/) |
Once your app settings are saved, you can access them in your code by referencing the environment variables. Add the following to the app.js file:
For Azure OpenAI:
// access environment variables
const endpoint = process.env.ENDPOINT;
const apiKey = process.env.API_KEY;
const deployment = process.env.DEPLOYMENT_NAME;
const apiVersion = process.env.API_VERSION;
For OpenAI:
const apiKey = process.env.OPENAI_API_KEY;
Add the OpenAI package
Before you can create the client, you first need to add the OpenAI package. Add the following packages by using the Node package manager (npm).
For Azure OpenAI:
npm install openai @azure/openai
For OpenAI:
npm install openai
Create OpenAI client
Once the package and environment variables are set up, we can create the client that enables chat completion calls.
Add the following code to create the OpenAI client:
For Azure OpenAI:
const { AzureOpenAI } = require("openai");
const client = new AzureOpenAI({
endpoint: endpoint,
deployment: deployment,
apiKey: apiKey,
apiVersion: apiVersion,
});
For OpenAI:
import OpenAI from 'openai';
const client = new OpenAI({
apiKey: apiKey,
});
Secure your app with managed identity
Although optional, it's highly recommended to secure your application using managed identity to authenticate your app to your Azure OpenAI resource. Skip this step if you are not using Azure OpenAI. This enables your application to access the Azure OpenAI resource without needing to manage API keys.
Follow the steps below to secure your application:
Install the Azure identity package using the Node package manager.
npm install @azure/identity
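The credential and token provider used below come from the @azure/identity package. Add the require statement near the top of app.js in the same CommonJS style as the rest of the file (this assumes a recent version of @azure/identity that exports getBearerTokenProvider):
const { DefaultAzureCredential, getBearerTokenProvider } = require("@azure/identity");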
Create token provider using the default Azure credential.
const credential = new DefaultAzureCredential();
const scope = "https://cognitiveservices.azure.com/.default";
const azureADTokenProvider = getBearerTokenProvider(credential, scope);
Create the Azure OpenAI client with the token provider. The deployment and apiVersion values are the ones you read from the app settings earlier.
const client = new AzureOpenAI({ azureADTokenProvider, deployment, apiVersion });
Once the credentials are added to the application, enable managed identity in your application and grant access to the resource:
- In your web app resource, navigate to the Identity blade and turn on System assigned and select Save.
- Once System assigned identity is turned on, it will register the web app with Microsoft Entra ID and the web app can be granted permissions to access protected resources.
- Go to your Azure OpenAI resource and navigate to the Access control (IAM) page on the left pane.
- Find the Grant access to this resource card and select Add role assignment.
- Search for the Cognitive Services OpenAI User role and select Next.
- On the Members tab, find Assign access to and choose the Managed identity option.
- Next, select +Select Members and find your web app.
- Select Review + assign.
Your web app is now added as a cognitive service OpenAI user and can communicate to your Azure OpenAI resource.
Set up prompt and call to OpenAI
Now that our OpenAI client is created, we can use chat completions to send our request message to OpenAI and return a response. Here's where we add our chat message prompt to the code to be passed to the chat completions method. Use the following code to set up chat completions:
app.post("/api/completions", async (req, res) => {
// Azure OpenAI client
const client = new AzureOpenAI({
endpoint: endpoint,
deployment: deployment,
apiKey: apiKey,
apiVersion: apiVersion,
});
// OpenAI client (use instead of the Azure OpenAI client above)
// const client = new OpenAI({
//   apiKey: apiKey,
// });
const prompt = req.body.prompt;
const chatCompletions = await client.chat.completions.create({
messages: [
{ role: "system", content: "You are a helpful assistant" },
{ role: "user", content: prompt },
],
model: "",
max_tokens: 128,
stream: true,
});
var response = "";
for await (const chatCompletion of chatCompletions) {
for (const choice of chatCompletion.choices) {
response += choice.delta?.content ?? "";
}
}
console.log(response);
res.send(response);
});
This POST handler creates the OpenAI client, sends the prompt to OpenAI, and returns the accumulated response.
Here's the example in its completed form. In this example, use the Azure OpenAI chat completion service OR the OpenAI chat completion service, not both.
var createError = require('http-errors');
var express = require('express');
var path = require('path');
var cookieParser = require('cookie-parser');
var logger = require('morgan');
const { AzureOpenAI } = require("openai");
//import OpenAI from 'openai';
var indexRouter = require('./routes/index');
var usersRouter = require('./routes/users');
var app = express();
// view engine setup
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'ejs');
app.use(logger('dev'));
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, 'public')));
app.use('/', indexRouter);
app.use('/users', usersRouter);
// variables
const endpoint = "your-openai-endpoint";
const apiKey = "your-openai-apikey";
const deployment = "your-openai-deployment-name";
const apiVersion = "your-openai-api-version";
// chat completion
app.post("/api/completions", async (req, res) => {
const client = new AzureOpenAI({
endpoint: endpoint,
deployment: deployment,
apiKey: apiKey,
apiVersion: apiVersion,
});
// OpenAI client
// const client = new OpenAI({
// apiKey: apiKey,
// });
const prompt = req.body.prompt;
const chatCompletions = await client.chat.completions.create({
messages: [
{ role: "system", content: "You are a helpful assistant" },
{ role: "user", content: prompt },
],
model: "",
max_tokens: 128,
stream: true,
});
var response = "";
for await (const chatCompletion of chatCompletions) {
for (const choice of chatCompletion.choices) {
response += choice.delta?.content ?? "";
}
}
console.log(response);
res.send(response);
});
// catch 404 and forward to error handler
app.use(function(req, res, next) {
next(createError(404));
});
// error handler
app.use(function(err, req, res, next) {
// set locals, only providing error in development
res.locals.message = err.message;
res.locals.error = req.app.get('env') === 'development' ? err : {};
// render the error page
res.status(err.status || 500);
res.render('error');
});
module.exports = app;
Deploy to App Service
If you completed the steps above, you can deploy to App Service as you normally would. If you run into any issues, remember that you need to complete the following steps: grant your app access to your Key Vault, and add the app settings with key vault references as your values. App Service resolves the app settings in your application that match what you added in the portal.
Once the app is deployed, you can visit your site URL and see the text that contains the response from your chat message prompt.
Authentication
Although optional, it's highly recommended that you also add authentication to your web app when using an Azure OpenAI or OpenAI service. This adds a layer of security with no additional code. Learn how to enable authentication for your web app here.
Once deployed, browse to the web app, enter a query in the input box, and you should see a populated response from the server. The tutorial is now complete, and you know how to use OpenAI services to create intelligent applications.