Running into problems trying out a Microsoft Learn Sample on AI Foundry

Bharat Ruparel 20 Reputation points Microsoft Employee
2025-05-03T17:29:22.8633333+00:00

Trying to run the basic chat application example given here:

https://learn.microsoft.com/en-us/azure/ai-foundry/quickstarts/get-started-code?tabs=windows

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Create the project client from the connection string shown on the
# project's overview page in the Azure AI Foundry portal
project = AIProjectClient.from_connection_string(
    conn_str=project_connection_string, credential=DefaultAzureCredential()
)
chat = project.inference.get_chat_completions_client()
response = chat.complete(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You are an AI assistant that speaks like a techno punk rocker from 2350. Be cool but not too cool. Ya dig?",
        },
        {"role": "user", "content": "Hey, can you help me with my taxes? I'm a freelancer."},
    ],
)
print(response.choices[0].message.content)
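Note that `project_connection_string` is not defined in the snippet above; a minimal sketch of how it can be supplied (the environment-variable name here is illustrative, not from the quickstart) is:

```python
import os

# Illustrative only: read the project connection string (copied from the
# project's overview page in the AI Foundry portal) from an environment
# variable, falling back to a placeholder.
project_connection_string = os.environ.get(
    "PROJECT_CONNECTION_STRING", "<your-project-connection-string>"
)
```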

Running into the following error:

Traceback (most recent call last):
  File "C:\Users\bruparel\Learn\AIFoundry\basic\chat.py", line 15, in <module>
    response = chat.complete(
               ^^^^^^^^^^^^^^
  File "C:\Users\bruparel\Learn\AIFoundry\basic\.venv\Lib\site-packages\azure\ai\inference\_patch.py", line 738, in complete
    raise HttpResponseError(response=response)
azure.core.exceptions.HttpResponseError: (None) Invalid URL (POST /v1/chat/completions)
Code: None
Message: Invalid URL (POST /v1/chat/completions)

Please advise on how to fix it.
Thanks.

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.

Accepted answer
  Jeffrey Chen 75 Reputation points Microsoft Employee
    2025-05-06T19:00:16.11+00:00

    Ran into this myself too. Not sure why, but for certain OpenAI models, you now have to use the Azure OpenAI client instead of the chat completions client:

    project = AIProjectClient.from_connection_string(
        conn_str=project_connection_string, credential=DefaultAzureCredential()
    )
    # Route the call through the Azure OpenAI client rather than the
    # generic chat completions client
    openai_client = project.inference.get_azure_openai_client(api_version="2024-06-01")
    response = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,  # same messages list as in your snippet
    )
    print(response.choices[0].message.content)
    

    If you use a non-OpenAI model like Phi-4, then the code you currently have should work.
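    The rule of thumb above can be sketched as a small helper. The prefix list here is an assumption for illustration, not an official mapping maintained by the SDK:

```python
# Illustrative helper: decide which inference client to request from the
# project, based on model family. The prefix list is an assumption, not an
# official SDK mapping.
OPENAI_MODEL_PREFIXES = ("gpt-", "o1", "o3", "text-embedding")

def needs_azure_openai_client(model_name: str) -> bool:
    """True if the model should be called via
    project.inference.get_azure_openai_client(...); False if the generic
    project.inference.get_chat_completions_client() should work."""
    return model_name.lower().startswith(OPENAI_MODEL_PREFIXES)
```

    For example, `needs_azure_openai_client("gpt-4o-mini")` returns `True`, while `needs_azure_openai_client("Phi-4")` returns `False`.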


0 additional answers
