Azure AI Services, formerly known as Azure Cognitive Services, is a unified collection of prebuilt AI capabilities within the Microsoft Foundry platform.
Thanks for sharing the details!
Separate traces per LLM call – Each llm.invoke() is treated as an individual run in LangChain, so AI Foundry logs every call as its own trace. To group related calls into a single trace, wrap your logic in a parent span, or use a full chain/agent instead of calling the raw model directly.
Changing the name – The name shown (e.g., chat gpt-4.1-mini-2025-04-14) comes from your Azure OpenAI deployment name. To update it, rename or recreate the deployment with the desired name in Azure.
For more guidance, check the official docs: Trace AI Agents in Azure AI Foundry
Please let me know if there are any remaining questions or additional details I can help with; I’ll be glad to provide further clarification or guidance.