Azure Functions - LangChain with Azure OpenAI and ChatGPT (Python v2 Function)
This sample shows how to take a human prompt as HTTP GET or POST input and calculate completions using chains that combine the human input with prompt templates. It is a starting point that can be used for more sophisticated chains.
Run on your local environment
Pre-reqs
- Python 3.8+
- Azure Functions Core Tools
- Azure Developer CLI
- Once you have your Azure subscription, run the following in a new terminal window to create Azure OpenAI and other resources needed:
azd provision
Take note of the value of AZURE_OPENAI_ENDPOINT, which can be found in ./.azure/<env name from azd provision>/.env. It will look something like:
AZURE_OPENAI_ENDPOINT="https://cog-<unique string>.openai.azure.com/"
- Add this local.settings.json file to the root of the repo folder to simplify local development. Replace AZURE_OPENAI_ENDPOINT with your value from the previous step. Optionally, you can choose a different model deployment in AZURE_OPENAI_CHATGPT_DEPLOYMENT. This file is gitignored to protect secrets from being committed to your repo; by default, however, the sample uses Entra identity (user identity and managed identity), so it is secretless.
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "python",
"AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"AZURE_OPENAI_ENDPOINT": "https://<your deployment>.openai.azure.com/",
"AZURE_OPENAI_CHATGPT_DEPLOYMENT": "chat",
"OPENAI_API_VERSION": "2023-05-15"
}
}
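At startup the function app reads these values from the environment. A minimal sketch of that pattern (the helper name and validation here are illustrative, not the sample's actual init()):

```python
import os

# Hypothetical helper mirroring the setting names in local.settings.json.
# The function name and the https check are illustrative additions.
def load_openai_settings() -> dict:
    settings = {
        "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        "deployment": os.environ.get("AZURE_OPENAI_CHATGPT_DEPLOYMENT", "chat"),
        "api_version": os.environ.get("OPENAI_API_VERSION", "2023-05-15"),
    }
    # Fail fast if the endpoint is missing or malformed
    if not settings["endpoint"].startswith("https://"):
        raise ValueError("AZURE_OPENAI_ENDPOINT must be an https URL")
    return settings
```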
Using Functions CLI
- Open a new terminal and do the following:
pip3 install -r requirements.txt
func start
- Using your favorite REST client (e.g. the REST Client extension in VS Code, Postman, or curl), make a POST request. A test.http file has been provided to run this quickly.
Terminal:
curl -i -X POST http://localhost:7071/api/ask/ \
-H "Content-Type: application/json" \
--data-binary "@testdata.json"
testdata.json
{
"prompt": "What is a good feature of Azure Functions?"
}
test.http
POST http://localhost:7071/api/ask HTTP/1.1
content-type: application/json
{
"prompt": "What is a good feature of Azure Functions?"
}
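The same request can also be issued from Python using only the standard library. A small sketch (the actual send is commented out because it requires the local Functions host started with func start):

```python
import json
import urllib.request

# Build the same POST request that test.http and the curl command above issue.
body = json.dumps({"prompt": "What is a good feature of Azure Functions?"}).encode("utf-8")
req = urllib.request.Request(
    "http://localhost:7071/api/ask",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the Functions host running (func start), send it like this:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```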
Using Visual Studio Code
- Open this repo in VS Code:
code .
Follow the prompts to load the Functions project. It is recommended to initialize the Functions project for VS Code and to enable a virtual environment for your chosen version of Python.
Run and Debug
- Press F5 to run and debug the app.
- Test using the same REST client steps described above.
Deploy to Azure
The easiest way to deploy this app is using the Azure Developer CLI (azd). If you open this repo in GitHub Codespaces, the azd tooling is already preinstalled.
To provision and deploy:
azd up
Source Code
The key code that makes the prompting and completion work is in function_app.py. The /api/ask function and route expect a prompt in the POST body, using a standard HTTP trigger in Python. Once the environment variables are set to configure the OpenAI and LangChain frameworks via the init() function, we can leverage LangChain in the main() (ask) function. In this simple example we take a prompt, build a better prompt from a template, and then invoke the LLM. By default the LLM deployment is gpt-35-turbo, as defined in ./infra/main.parameters.json, but you can experiment with other models and other aspects of LangChain's breadth of features.
# Create the chat client against the configured Azure OpenAI deployment
llm = AzureChatOpenAI(
    deployment_name=AZURE_OPENAI_CHATGPT_DEPLOYMENT,
    temperature=0.3
)
# Build a better prompt around the user's input
llm_prompt = PromptTemplate.from_template(
    "The following is a conversation with an AI assistant. "
    "The assistant is helpful.\n\n"
    "AI: How can I help you today?\n"
    "Human: {human_prompt}?"
)
# Fill the template with the user's prompt and invoke the model
formatted_prompt = llm_prompt.format(human_prompt=prompt)
response = llm.invoke(formatted_prompt)
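PromptTemplate.from_template behaves much like Python's own str.format: the {human_prompt} placeholder is substituted into the template text before the string is sent to the model. A plain-Python sketch of that substitution, using a simplified version of the template (no LangChain required):

```python
# Simplified template illustrating what formatted_prompt contains
# before llm.invoke() is called; the wording is abridged from the sample.
template = (
    "The following is a conversation with an AI assistant. "
    "The assistant is helpful.\n\n"
    "Human: {human_prompt}?"
)

def format_prompt(human_prompt: str) -> str:
    # Mirrors llm_prompt.format(human_prompt=prompt)
    return template.format(human_prompt=human_prompt)

print(format_prompt("What is a good feature of Azure Functions"))
```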