Unsupported credential error from azure.ai.inference when using a Llama model

strawberrybfs
2024-11-24T05:14:22.28+00:00

Hi, I was trying to use a Llama model by following this tutorial:

https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama?tabs=python-llama-3-2%2Cmeta-llama-3-1&pivots=programming-language-python

However, I'm not sure whether my credential is correct. I'm using the format below, where your-host-name (AZURE_INFERENCE_ENDPOINT) is the Target shown in the screenshot and AZURE_INFERENCE_CREDENTIAL is the Key shown there. When I run it, I get this error: "TypeError: Unsupported credential: "
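For context, the environment variables are set roughly like this before the script runs (the values here are placeholders, not my real Target or Key):

import os

# Placeholders only; the real values come from the deployment's Target and Key in the portal
os.environ["AZURE_INFERENCE_ENDPOINT"] = "https://<your-host-name>.<region>.models.ai.azure.com"
os.environ["AZURE_INFERENCE_CREDENTIAL"] = "<your-key>"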

Can anyone help? Thank you!

[Screenshot 2024-11-23 at 9.09.32 PM: deployment Target and Key]

import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)
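For reference, the rest of the pattern I'm trying to reproduce from the tutorial looks roughly like this (the message content is just a placeholder):

from azure.ai.inference.models import SystemMessage, UserMessage

# Send a simple chat completion request to the deployed Llama endpoint
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Hello, can you introduce yourself?"),
    ],
)

print(response.choices[0].message.content)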