From the error output above, it looks like you're calling the https://platform.openai.com/ endpoint rather than your Azure OpenAI endpoint.
Instead of importing AzureOpenAI from langchain.llms, use AzureChatOpenAI, since gpt-35-turbo-16k is a chat model. Also install the openai package (pip install openai) if it isn't already installed, then import it:
```python
from langchain.chat_models import AzureChatOpenAI
import openai
```
Set the openai API type, key, base URL, and version:
```python
openai.api_key = "Your Azure OpenAI key"
openai.api_base = "https://YOUR_RESOURCE_NAME.openai.azure.com/"
openai.api_type = "azure"
openai.api_version = "2023-03-15-preview"
```
Now, when creating the llm you pass to the chain, use AzureChatOpenAI instead of AzureOpenAI:
```python
llm = AzureChatOpenAI(
    deployment_name="gpt-35-turbo-16k",
    openai_api_base="https://YOUR_RESOURCE_NAME.openai.azure.com/",
    openai_api_key="Your Azure OpenAI key",
    openai_api_type="azure",
    openai_api_version="2023-03-15-preview",
)
```
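Once the llm is configured, a quick way to verify the credentials and deployment is to invoke it directly before wiring it into a chain. This is a minimal sketch, assuming the legacy langchain chat interface where the model is called with a list of messages:

```python
from langchain.schema import HumanMessage

# Quick sanity check: if the endpoint, key, and deployment name are correct,
# this returns an AIMessage; otherwise it raises an authentication or
# resource-not-found error pointing at the misconfigured value.
response = llm([HumanMessage(content="Say hello in one word.")])
print(response.content)
```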
Similarly, you need to provide the Azure details for OpenAIEmbeddings().
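As a sketch of what that configuration might look like, here is OpenAIEmbeddings set up with the same Azure parameters. The deployment name "text-embedding-ada-002" is an assumption; use whatever name you gave your embeddings deployment in the Azure portal:

```python
from langchain.embeddings import OpenAIEmbeddings

# Assumed deployment name for the embeddings model; replace with the
# deployment name from your own Azure OpenAI resource.
embeddings = OpenAIEmbeddings(
    deployment="text-embedding-ada-002",
    openai_api_base="https://YOUR_RESOURCE_NAME.openai.azure.com/",
    openai_api_key="Your Azure OpenAI key",
    openai_api_type="azure",
    openai_api_version="2023-03-15-preview",
)
```

Note that Azure requires a separate deployment for the embeddings model; pointing the embeddings class at the gpt-35-turbo-16k deployment will fail.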
If this answer has helped you solve your problem, please consider marking it as correct by clicking the "Mark as Answer" button. This will help others with similar questions find the solution more easily.
Thank you!