How to fix: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}

Geena Kim 0 Reputation points
2024-11-16T02:43:16+00:00

Can someone give an example? I matched the API version to the one in the model deployment endpoint's target URI, for example (note that the real address is masked with [azureopenai-resourcename] since everyone's case will be different):

https://[azureopenai-resourcename].openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview

Then I set the required environment variables like

export AZURE_OPENAI_API_KEY="<replaced with my key>"
export AZURE_OPENAI_ENDPOINT="https://[azureopenai-resourcename].openai.azure.com/"
export OPENAI_API_TYPE="azure"
export OPENAI_API_VERSION="2024-08-01-preview"  # This is to match the api version that shows up in the target uri.
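
For reference, here is roughly how those variables are consumed on my side (a minimal sketch, assuming the openai Python SDK's AzureOpenAI client; gpt-4o is my deployment name):

import os
from openai import AzureOpenAI

# The client can also fall back to AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT
# and OPENAI_API_VERSION when these arguments are omitted.
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version=os.environ["OPENAI_API_VERSION"],
)

response = client.chat.completions.create(
    model="gpt-4o",  # the deployment name, not the underlying model name
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)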

But it still gives an error message like:

INFO:httpx:HTTP Request: POST https://[azureopenai-resourcename].openai.azure.com//openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview "HTTP/1.1 401 Unauthorized" Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}

When I use LangChain to load the embedding model, it works fine, but the reranker that uses the GPT model(s) causes the problem. I tested with export OPENAI_API_TYPE="openai" and my personal account (in that case it didn't need any auth other than OPENAI_API_KEY). So LangChain's rerank itself works, and it somehow breaks when the Azure credentials are passed.
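
To be concrete, the two clients are constructed roughly like this (a sketch, assuming the langchain_openai package; the embedding deployment name is a placeholder for mine):

from langchain_openai import AzureOpenAIEmbeddings, AzureChatOpenAI

# Both clients pick up AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT and
# OPENAI_API_VERSION from the environment variables set above.

# Embedding client -- this part works.
embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-ada-002",  # placeholder deployment name
)

# Chat client used for the rerank step -- this is where the 401 shows up.
llm = AzureChatOpenAI(
    azure_deployment="gpt-4o",
)

embeddings.embed_query("test")  # ok
llm.invoke("test")              # 401 Unauthorized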

So other than those env variables (AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, OPENAI_API_TYPE, OPENAI_API_VERSION), what "Access token" is required? And what is https://cognitiveservices.azure.com? Do we need to set up another endpoint besides the model deployment?


1 answer

  1. Max Lacy 345 Reputation points
    2024-11-16T15:49:52.81+00:00

    Geena Kim,

    In the past I've struggled with getting a model deployment's endpoint to work. Here are the steps I take when I get the Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. ...'} response:

    • Match AI Studio Chat Endpoint - I'm assuming your model is deployed through Azure AI Studio. If that's the case, you can use the Chat Playground to generate an example of working code for your endpoint: there is a "show code" option, and within it you can select your programming language of choice. If you're looking at REST calls, I would use the curl language selector and make sure my values match.
    • Debug Env Variables - Env variables are tricky. I would confirm that the env variables I'm using in my API call hold the values I expect:

    echo $AZURE_OPENAI_API_KEY
    echo $AZURE_OPENAI_ENDPOINT
    echo $OPENAI_API_TYPE
    echo $OPENAI_API_VERSION

    If you've configured your env variables correctly, they should match what you see in the AI Studio Chat Playground. (There's also a quick in-process check sketched after this list.)

    • API Call to Test Service - Lastly, I'd test the service with something other than the script itself. I'd use Postman or a similar tool to confirm that when I hit the API endpoint directly I get the expected result (see the sketch after this list for a minimal raw call).
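
    As a quick follow-up on the env variable point: it's also worth confirming the variables are visible inside the Python process that actually makes the call, since values exported in one shell aren't always inherited by a notebook or service started elsewhere. A minimal sketch:

    import os

    # Report which of the expected variables the running process can actually see.
    for name in ("AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT",
                 "OPENAI_API_TYPE", "OPENAI_API_VERSION"):
        value = os.environ.get(name)
        print(name, "->", "MISSING" if not value else f"set ({len(value)} chars)")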
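
    And for the direct API test, if you'd rather stay in Python than reach for Postman or curl, a bare call against the same chat-completions URL (key-based auth via the api-key header; the resource name and deployment below are placeholders for yours) looks roughly like this:

    import os
    import requests

    url = (
        "https://[azureopenai-resourcename].openai.azure.com/openai/deployments/"
        "gpt-4o/chat/completions?api-version=2024-08-01-preview"
    )
    headers = {
        "Content-Type": "application/json",
        # Key-based auth; Entra ID auth sends an Authorization: Bearer token instead.
        "api-key": os.environ["AZURE_OPENAI_API_KEY"],
    }
    body = {"messages": [{"role": "user", "content": "ping"}]}

    resp = requests.post(url, headers=headers, json=body)
    print(resp.status_code, resp.json())

    If that bare call succeeds while the LangChain path still returns 401, the key and endpoint themselves are fine and the client configuration is the place to look.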

    Let me know if this helps or if you need further clarification.

    -Max

