
Migrating from the classic Azure Foundry Python SDK to the new one?

staff01 35 Reputation points
2026-03-12T15:38:40.9266667+00:00

I have a system that is currently using the agents + threads + messages functionality in Azure Foundry. While attempting to migrate to the new SDK, azure-ai-projects==2.0.0, I ran into some difficulty transferring the old agents flow to the new one.

Current setup

  • Using azure-ai-projects==1.0.0
  • Agents are created with instructions and any relevant files attached to them.
  • Threads are created, messages are sent by the user and runs are performed for agent response

Problems migrating

  • Threads are replaced by conversations. But I don't know how to list messages in a conversation.
  • Runs are replaced by Openai responses. But I don't know how to make a specific agent respond in a conversation, similar to how it did for threads.

It would be really helpful if there were more complete migration guides for the python sdk. Thanks.

Foundry Tools

Formerly known as Azure AI Services or Azure Cognitive Services, Foundry Tools is a unified collection of prebuilt AI capabilities within the Microsoft Foundry platform.


2 answers

  1. Manas Mohanty 16,190 Reputation points Microsoft External Staff Moderator
    2026-03-23T23:44:15.71+00:00

    Hi staff01,

    Sorry for the delay in response, and thank you for giving us time to investigate.

    For simplicity, I will focus on the new Foundry code itself.

    • Runs are replaced by Openai responses. But I don't know how to make a specific agent respond in a conversation, similar to how it did for threads.

    Step 1: Create the agent with the desired tools.

    from azure.ai.projects import AIProjectClient
    from azure.ai.projects.models import (
        CodeInterpreterTool,
        PromptAgentDefinition,
    )
    from azure.identity import DefaultAzureCredential
    
    # Create the project client (use your Foundry project endpoint)
    project = AIProjectClient(
        endpoint="<your-project-endpoint>",
        credential=DefaultAzureCredential(),
    )
    
    # Create the agent
    agent = project.agents.create_version(
        agent_name="my-agent",
        definition=PromptAgentDefinition(
            model="gpt-4.1",
            instructions=(
                "You politely help with math "
                "questions. Use the Code "
                "Interpreter tool when asked "
                "to visualize numbers."
            ),
            tools=[CodeInterpreterTool()],
        ),
    )
    
    

    Step 2: Create a conversation with the initial user input.

    # Obtain the OpenAI client from the project client, then
    # create a conversation with an initial message
    openai = project.get_openai_client()
    
    conversation = openai.conversations.create(
        items=[
            {
                "type": "message",
                "role": "user",
                "content": (
                    "Hi, Agent! Draw a graph "
                    "for a line with a rate "
                    "of change of 4 and "
                    "y-intercept of 9."
                ),
            }
        ],
    )
    

    Step 3: Use the conversation ID and the agent name to generate the response.

    # Send a response with the agent
    response = openai.responses.create(
        conversation=conversation.id,
        extra_body={
            "agent_reference": {
                "name": agent.name,
                "type": "agent_reference",
            }
        },
        input=(
            "Please address the user as "
            "Jane Doe. The user has a "
            "premium account"
        ),
    )
    
    

    Regarding the query below:

    I can't find any reference on how to list messages in a conversation - it's like it doesn't exist.

    Once the response is generated with conversation.id and the user input, use a for loop to print all items in the response output.

    # Reusing the `response` created in Step 3 above
    
    # Print the response output
    for item in response.output:
        if item.type == "message":
            for block in item.content:
                print(block.text)
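
The loop above prints only the output of the single response just created. If you also need the full conversation history (the closest analogue to listing thread messages in the classic SDK), you can list the conversation's items. The `conversations.items.list` call in the comment below is assumed from the OpenAI Conversations API surface; the `message_texts` helper is plain Python and only illustrates the filtering step, since conversations can also hold tool calls and tool outputs.

```python
# Sketch: pull the text out of every "message" item, skipping tool
# calls and other item types. Works on SDK objects or plain dicts.
def message_texts(items):
    texts = []
    for item in items:
        item_type = getattr(item, "type", None) or item.get("type")
        if item_type != "message":
            continue
        content = getattr(item, "content", None) or item.get("content", [])
        for block in content:
            text = getattr(block, "text", None) or block.get("text")
            if text:
                texts.append(text)
    return texts

# Hypothetical usage against a live conversation (not run here):
# items = openai.conversations.items.list(conversation.id)
# print(message_texts(items))
```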
    
    
    
    

    Hope it provides the needed clarity.

    Thank you


  2. Q&A Assist
    2026-03-12T15:39:05.8133333+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    For the new azure-ai-projects==2.x SDK and Foundry “new” agents experience, the mapping is:

    • Classic: agents → New: agents (via project.agents.create_version / createAgentVersionAsync, etc.)
    • Classic: threads → New: conversations (openai.conversations.*)
    • Classic: runs → New: responses (openai.responses.* or ProjectResponsesClient)

    The migration guide clarifies three key pieces relevant to the questions: creating conversations, adding/listing items (messages) in a conversation, and invoking a specific agent to respond.

    1. Creating a conversation (replacement for thread creation)

    In the new model, a conversation is created directly with the OpenAI client obtained from the project:

    conversation = openai.conversations.create(
        items=[
            {
                "type": "message",
                "role": "user",
                "content": "Tell me a one line funny "
                           "story about unicorns",
            }
        ],
        metadata={"agent": "my-awesome-agent"},
    )
    

    This replaces client.agents.threads.create(...) from the classic SDK.

    2. Adding messages to an existing conversation (listing/working with messages)

    In classic agents, messages were added with client.agents.messages.create(thread_id=...). In the new experience, messages and other items (tool calls, outputs, etc.) are stored as conversation items. The guide states:

    After you create a conversation, use conversations.items.create() to add subsequent messages. This pattern replaces adding messages to threads with client.agents.messages.create().

    So, to add a new user message to an existing conversation, use the items API on that conversation. While the Python snippet above only shows the initial create call, the pattern is:

    • Use openai.conversations.items.create(...) (Python) to append a new type: "message" item with role: "user".
    • To “list messages”, list conversation items and filter for type == "message" as needed, since conversations can contain messages, tool calls, and tool outputs.
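
As a concrete sketch of the append-then-list pattern just described: the `user_message_item` helper below only builds the item payload, while the `conversations.items.create` / `conversations.items.list` calls in the comments are assumed from the OpenAI Conversations API (verify the exact method names against your installed `openai` package version).

```python
# Build the payload for a new "message" conversation item. A plain
# string is used for content here; the API also accepts a list of
# typed content parts.
def user_message_item(text):
    return {"type": "message", "role": "user", "content": text}

# Hypothetical usage (not run here):
# openai.conversations.items.create(
#     conversation.id,
#     items=[user_message_item("And what is the x-intercept?")],
# )
# # "Listing messages" = listing items, then keeping type == "message":
# for item in openai.conversations.items.list(conversation.id):
#     if item.type == "message":
#         print(item.role, item.content)
```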
    3. Making a specific agent respond in a conversation (runs → responses)

    Classic runs are replaced by responses. The migration guide shows how to invoke a particular agent against a conversation.

    Python example:

    conversation_id = "conv_11112222AAAABBBB"
    
    response = openai.responses.create(
        input="Hi, Agent! Draw a graph for a line "
              "with a slope of 4 and "
              "y-intercept of 9.",
        conversation=conversation_id,
        extra_body={
            "agent_reference": {
                "name": "my-agent",
                "type": "agent_reference",
            }
        },
    )
    

    Key points:

    • conversation is set to the existing conversation ID (replacement for thread_id).
    • input is the prompt for this turn.
    • agent_reference identifies the specific agent to use (by name in this example). This is the direct replacement for “run this agent on this thread”.

    Other languages show the same pattern:

    • C#: ProjectResponsesClient responsesClient = projectClient.OpenAI.GetProjectResponsesClientForAgent(new AgentReference { Name = "my-agent" }); then CreateResponse("...").
    • JavaScript: openAIClient.responses.create({ input: "...", conversation: conversationId, agent_reference: { name: "my-agent", type: "agent_reference" } }).
    • Java: responsesClient.createWithAgent(new AgentReference("my-agent"), ResponseCreateParams.builder().input("...")).
    4. Migrating classic agents to new agents

    For completeness, the agent definition itself is migrated using the project’s agents client. For example in Python:

    from azure.ai.projects.models import (
        CodeInterpreterTool,
        PromptAgentDefinition,
    )
    
    agent = project.agents.create_version(
        agent_name="my-agent",
        definition=PromptAgentDefinition(
            model="gpt-4.1",
            instructions=(
                "You politely help with math "
                "questions. Use the Code "
                "Interpreter tool when asked to "
                "visualize numbers."
            ),
            tools=[CodeInterpreterTool()],
        ),
    )
    

    This creates the agent version that is then referenced via agent_reference when creating responses.

    5. Verifying the migration

    The migration guide recommends:

    1. Run the updated code and ensure it executes without errors.
    2. Confirm agent creation: create_version() should return an object with id and version.
    3. Test a conversation: create a conversation, send a response, and verify output items are returned.
    4. Confirm context retention: send multiple responses to the same conversation and verify the agent remembers earlier messages.
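
For checklist item 3, a small offline-testable helper can make the verification explicit. `contains_message` is plain Python introduced here for illustration (not part of either SDK); the live calls in the comments mirror the snippets above.

```python
# Check that a response's output contains at least one "message" item
# (i.e. the agent produced a visible reply, not only tool calls or
# reasoning items). Accepts SDK objects or plain dicts.
def contains_message(output_items):
    return any(
        (getattr(item, "type", None) or item.get("type")) == "message"
        for item in output_items
    )

# Hypothetical usage after a live call (not run here):
# response = openai.responses.create(
#     conversation=conversation_id,
#     input="Now change the slope to 2.",
#     extra_body={"agent_reference": {"name": "my-agent",
#                                     "type": "agent_reference"}},
# )
# assert contains_message(response.output)
```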

    In summary:

    • To list messages: work with conversation items; messages are type: "message" items within a conversation.
    • To have a specific agent respond: call openai.responses.create(...) with conversation=<conversation_id> and agent_reference pointing to the desired agent.
