Hello,
Welcome to Microsoft Q&A!
No, this is not currently supported. Azure AI Foundry Local does not support OpenAI-style function calling out of the box in a way that is directly compatible with the OpenAI Python client (e.g., `openai.ChatCompletion.create(...)` with `functions=[...]`).
Foundry Local's LLM API is OpenAI-compatible only for the basic chat completion, embedding, and completion endpoints, not function calling. Azure AI Foundry Local currently lacks full support for interpreting the tool-calling JSON schema in the model's execution logic. Models like Phi, Mistral, Llama 2, or GPT-Neo running in the local Foundry don't inherently know how to interpret OpenAI's tool/function call format unless you wrap them with a tool-calling orchestrator layer.
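As a rough illustration of what such an orchestrator layer does, here is a minimal sketch: it instructs the model (via the prompt) to reply with a JSON tool call, then parses the reply and dispatches to a local Python function. The tool names, prompt wording, and JSON shape below are my own assumptions for illustration, not anything Foundry Local defines; the model reply is simulated rather than fetched from the local endpoint.

```python
import json

# Hypothetical tool registry: plain Python callables the orchestrator can invoke.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def build_tool_prompt(user_message: str) -> str:
    """Wrap the user message with instructions telling the model to emit
    a JSON tool call instead of free-form text when a tool applies."""
    tool_names = ", ".join(TOOLS)
    return (
        f"You can call these tools: {tool_names}.\n"
        'To call one, reply ONLY with JSON: {"tool": "<name>", "arguments": {...}}\n'
        f"User: {user_message}"
    )

def dispatch(model_output: str) -> str:
    """Parse the model's reply; if it is a well-formed tool call, execute it.
    Otherwise return the text unchanged (the model answered directly)."""
    try:
        call = json.loads(model_output)
        return TOOLS[call["tool"]](**call["arguments"])
    except (json.JSONDecodeError, KeyError, TypeError):
        return model_output

# Simulated model reply; a real one would come from the local chat endpoint.
reply = '{"tool": "get_weather", "arguments": {"city": "Oslo"}}'
print(dispatch(reply))  # Sunny in Oslo
```

In practice you would send `build_tool_prompt(...)` to the local model's chat completion endpoint and feed its reply into `dispatch`; reliability depends heavily on how consistently the local model emits valid JSON.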
The Foundry Agents SDK provides a tool/plugin orchestration layer where you define functions (tools) as Python classes. Instead of relying on OpenAI's client-side function calling, you use the Agent pattern to manage tool execution.
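To show the shape of that pattern, here is a minimal sketch of tools as Python classes managed by an agent. This is illustrative only: the class and method names (`Tool`, `Agent`, `run`, `execute`) are my own and are not the actual Foundry Agents SDK API.

```python
# Hypothetical Agent-pattern sketch; names are illustrative, not the SDK's API.

class Tool:
    """Base class: each tool declares a name and implements run()."""
    name: str = ""
    description: str = ""

    def run(self, **kwargs) -> str:
        raise NotImplementedError

class AddNumbers(Tool):
    name = "add_numbers"
    description = "Adds two integers and returns the sum as text."

    def run(self, a: int, b: int) -> str:
        return str(a + b)

class Agent:
    """Holds a registry of tools and dispatches calls to them by name."""
    def __init__(self, tools):
        self.tools = {t.name: t for t in tools}

    def execute(self, tool_name: str, **kwargs) -> str:
        return self.tools[tool_name].run(**kwargs)

agent = Agent([AddNumbers()])
print(agent.execute("add_numbers", a=2, b=3))  # 5
```

The point of the pattern is that tool selection and execution happen in your orchestration code, so the local model only needs to produce text naming the tool, not OpenAI-formatted function-call messages.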
Please upvote and accept the answer if it helps!