Unable to instantiate and call an agent with AI Agent SDK

Rayane Laraki 20 Reputation points
2025-06-18T11:57:12.5566667+00:00

I have created a standalone Foundry project (outside of a hub) and I want to use the AI Agent service with models that are already deployed in a connected resource. Is that possible? I have connected the deployed models to the project and can see them in the portal, but when I use the SDK the run fails with: {'code': 'invalid_engine_error', 'message': 'Failed to resolve model info for: gpt-4o'}.
What is the required configuration? I have verified that the models I am using are available in the region (currently Sweden Central), and I have set the environment variables OPENAI_API_VERSION, AZURE_OPENAI_ENDPOINT, and AZURE_OPENAI_API_KEY.

Here is my code snippet:

import asyncio
import os

from azure.identity.aio import DefaultAzureCredential
from azure.ai.agents.aio import AgentsClient
from azure.ai.agents.models import ListSortOrder, MessageTextContent


async def main() -> None:
    async with DefaultAzureCredential() as creds:
        agents_client = AgentsClient(
            endpoint=os.environ["PROJECT_ENDPOINT"],
            credential=creds,
        )
        async with agents_client:
            agent = await agents_client.create_agent(
                model=os.environ["GPT_4O_DEPLOYMENT_NAME"],
                name="my-agent",
                instructions="You are a helpful agent",
            )
            print(f"Created agent, agent ID: {agent.id}")

            thread = await agents_client.threads.create()
            print(f"Created thread, thread ID: {thread.id}")

            message = await agents_client.messages.create(
                thread_id=thread.id, role="user", content="How to choose a washing machine"
            )
            print(f"Created message, message ID: {message.id}")

            run = await agents_client.runs.create(thread_id=thread.id, agent_id=agent.id)

            # Poll the run as long as the run status is queued or in progress
            while run.status in ["queued", "in_progress", "requires_action"]:
                # Wait for a second before polling again
                await asyncio.sleep(1)
                run = await agents_client.runs.get(thread_id=thread.id, run_id=run.id)
                print(f"Run status: {run.status}")

            if run.status == "failed":
                print(f"Run error: {run.last_error}")

            await agents_client.delete_agent(agent.id)
            print("Deleted agent")

            messages = agents_client.messages.list(
                thread_id=thread.id,
                order=ListSortOrder.ASCENDING,
            )
            async for msg in messages:
                last_part = msg.content[-1]
                if isinstance(last_part, MessageTextContent):
                    print(f"{msg.role}: {last_part.text.value}")


if __name__ == "__main__":
    asyncio.run(main())

3 answers

  1. Danny Dang 90 Reputation points Independent Advisor
    2025-06-19T08:54:32.6266667+00:00

    Hi Rayane,

    Thank you for contacting Q&A Forum.

    To address the issue you're experiencing with the AI Agent SDK, we need to take the following steps:

    1. Verify Project Endpoint: Please ensure that the project endpoint you're using is correct. This is crucial for the SDK to connect to the right resources.
    2. Test the Setup: Try reproducing the flow with a minimal example to verify the configuration — see the sketch after the reference link below. You can follow the example provided in the Azure AI Foundry Agents Quickstart; this will help confirm whether the configuration is correct and whether the issue persists.
    3. Support Ticket: If the problem continues, it might be beneficial to open a support ticket with Azure. They can provide more detailed assistance and help troubleshoot any underlying issues.

    Reference: https://learn.microsoft.com/en-us/azure/ai-foundry/agents/quickstart?context=%2Fazure%2Fai-foundry%2Fcontext%2Fcontext&pivots=programming-language-python-azure
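
    For steps 1 and 2, a quick check is to ask the project itself what it can resolve. Below is a minimal sketch assuming the azure-ai-projects package (1.0.0b10 or later) and the endpoint-based AIProjectClient; the exact operations and fields may differ in other SDK versions:

    import os

    from azure.identity import DefaultAzureCredential
    from azure.ai.projects import AIProjectClient

    # Connect to the Foundry project endpoint (not the Azure OpenAI endpoint).
    project_client = AIProjectClient(
        endpoint=os.environ["PROJECT_ENDPOINT"],
        credential=DefaultAzureCredential(),
    )

    # List the model deployments the project can resolve. A deployment that is
    # only reachable through a connected resource may show in the portal but not here.
    for deployment in project_client.deployments.list():
        print("deployment:", deployment.name)

    # List the connected resources for comparison.
    for connection in project_client.connections.list():
        print("connection:", connection.name)

    If the gpt-4o deployment does not appear in the deployments list, the agent service will not be able to resolve it, regardless of what the portal shows.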

    If I have answered your question, please accept this answer as a token of appreciation and don't forget to give a thumbs up for "Was it helpful"!

    Best Regards,


  2. Nicolas Robert 106 Reputation points MVP
    2025-06-20T07:24:23.64+00:00

    Hi @Rayane Laraki ,

    From what I've seen so far, you cannot use "bring your own model" from a separate Azure OpenAI resource in the Azure AI Agent Service, even if that resource has been connected to your Azure AI Foundry resource (or Azure AI Foundry project): the model deployment you use must be deployed within the Azure AI Foundry resource of your agent.

    See also the question I have posted in the Azure AI Foundry GitHub discussions about the same topic: https://github.com/orgs/azure-ai-foundry/discussions/59
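
    To make that concrete, here is a minimal sketch assuming the azure-ai-agents package; the deployment name is a placeholder:

    import os

    from azure.identity import DefaultAzureCredential
    from azure.ai.agents import AgentsClient

    agents_client = AgentsClient(
        endpoint=os.environ["PROJECT_ENDPOINT"],
        credential=DefaultAzureCredential(),
    )

    # "model" must be the name of a deployment that lives in the agent's own
    # Azure AI Foundry resource. If the name only resolves through a connected
    # Azure OpenAI resource, the agent is still created, but the run later fails
    # with {'code': 'invalid_engine_error', 'message': 'Failed to resolve model info for: ...'}
    agent = agents_client.create_agent(
        model="gpt-4o",  # deployment name inside the Foundry resource itself
        name="my-agent",
        instructions="You are a helpful agent",
    )
    print(agent.id)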


  3. Tena, Manuel 0 Reputation points
    2025-06-24T07:31:31.72+00:00

    Hi Nicolas,

    I got the same error. I fixed it by deploying a model in AI Foundry, not in Azure OpenAI. After deploying, you can assign it to your agent in the platform.
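
    For anyone who prefers to script this instead of using the portal, here is a sketch assuming the azure-mgmt-cognitiveservices package; the subscription, resource group, model version, and SKU values are placeholders/assumptions, and the account to target is the Foundry resource that hosts the project, not the separate Azure OpenAI resource:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient
    from azure.mgmt.cognitiveservices.models import (
        Deployment,
        DeploymentModel,
        DeploymentProperties,
        Sku,
    )

    client = CognitiveServicesManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
    )

    # Create the gpt-4o deployment directly on the Foundry resource so the
    # agent service can resolve it. Model version and SKU are assumptions;
    # use values that are available in your region.
    poller = client.deployments.begin_create_or_update(
        resource_group_name="<resource-group>",
        account_name="<foundry-resource-name>",
        deployment_name="gpt-4o",
        deployment=Deployment(
            properties=DeploymentProperties(
                model=DeploymentModel(format="OpenAI", name="gpt-4o", version="2024-11-20"),
            ),
            sku=Sku(name="GlobalStandard", capacity=10),
        ),
    )
    print(poller.result().name)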

    BR/

