Connecting a Python MCP server to Microsoft Copilot Studio doesn't work.
We are having significant trouble getting our MCP server to integrate with Microsoft Copilot Agents, both with SSE and streamable-http.
We have determined that the integration works perfectly fine with Node; however, we strongly want to use streamable-http, Python, and our own infrastructure.
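For reference, a minimal Python streamable-http server of the kind we are describing can be put together with the official mcp SDK's FastMCP roughly as follows. This is only a sketch: the server name matches the serverInfo in the logs below, the greet tool mirrors the Node demo's tool, and the default port/path are whatever the SDK chooses.

# server.py - minimal streamable-http MCP server (sketch, official mcp Python SDK)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-streamable-http-demo")

@mcp.tool()
def greet(name: str) -> str:
    """A simple greeting tool, mirroring the Node demo."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    # Serves the MCP endpoint over streamable HTTP (the SDK's default path is /mcp)
    mcp.run(transport="streamable-http")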
Right now we are stuck on the following.
When adding the MCP tool to the agent, we see that "no tools are available".
It should be noted that the internal "listtools" endpoint returns a response of {"errors":[],"tools":[]}.
Trying to use the MCP tool then just results in:
Sorry, something went wrong. Error code: SystemError. Conversation ID: ce97a943-40cd-42a8-a435-951f1bebcc12. Time (UTC): 6/19/2025 9:13:48 PM.
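To rule out the server itself, the tool list can be fetched directly with the MCP Python SDK's streamable-http client, bypassing Copilot Studio entirely. This is a sketch; the URL is a placeholder for wherever the server is actually hosted.

# probe.py - query the server's tool list outside of Copilot Studio (sketch)
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Placeholder URL; point this at the deployed MCP endpoint
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()          # initialize + notifications/initialized
            tools = await session.list_tools()  # tools/list
            print([t.name for t in tools.tools])

asyncio.run(main())

If this probe returns the greet tool while Copilot Studio still reports no tools, the problem is on the Copilot Studio side of the exchange rather than in the tool registration itself.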
Below we compare the messages exchanged with the Python server against those exchanged with the Node server.
Python:
Request
{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{"sampling":{},"roots":{"listChanged":true}},"clientInfo":{"name":"mcp-inspector","version":"0.14.3"}}}
Response
id: 7e23f86a-e9aa-4257-aa45-8035aa0eb391
event: message
data: {"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2024-11-05","capabilities":{"experimental":{},"tools":{"listChanged":false}},"serverInfo":{"name":"
mcp-streamable-http-demo","version":"1.9.5.dev14+d0443a1"}}}
In Node we see the following exchange of messages:
Request
{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{"_meta":{"progressToken":1}}}
Response:
event: message
id: 6db74d69-9b1a-4cb5-a654-3bb2e3296b97_1750692431847_27d7as6d
data: {"result":{"protocolVersion":"2024-11-05","capabilities":{"logging":{},"tools":{"listChanged":true},"prompts":{"listChanged":true},"completions":{},"res
ources":{"listChanged":true}},"serverInfo":{"name":"simple-streamable-http-
server","version":"1.0.0"}},"jsonrpc":"2.0","id":"1"}
Request (the server responds with OK, no content)
{"jsonrpc":"2.0","id":"1","method":"initialize","params":{"capabilities":{},"clientInfo":{"agentName":"Agent 1","appId":"b44d3eff-8a87-4a99-b831-139bf6b20515","channelId":"pva-studio","name":"mcs","version":"1.0.0"},"protocolVersion":"2024-11-05","sessionContext":{}}}
Request:
event: message
data: {"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2024-11-05","capabilities":{"experimental":{},"prompts":{"listChanged":false},"resources":{"subscribe":false,"listChanged":false},"tools":{"listChanged":false}},"serverInfo":{"name":"ardoq","version":"1.9.4"}}} ...
Response
event: message
id: 6a59ce93-6ddc-45f2-a26c-1dfd0e1a08ab_1750692432798_ibg3z1lh
data: {"result":{"tools":[{"name":"greet","title":"Greeting Tool","description":"A simple greeting .....
Since the Python flow cuts off after the first request/response pair, the fault presumably lies in what we send back in that first response.
If you compare the two responses to <initialize>, the only difference is in the server capabilities. Could missing capabilities be causing Microsoft to discontinue the flow?
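One way to test that hypothesis is to make the Python server advertise a broader capability set, closer to the Node demo's. On top of the server sketch above, registering a dummy prompt and resource should cause prompts and resources capabilities to appear in the initialize response, assuming the advertised capabilities follow from the registered primitives; the names here are placeholders.

# Added to the server sketch above: dummy prompt/resource so that the
# initialize response also advertises prompts and resources capabilities.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-streamable-http-demo")  # same instance as in the sketch above

@mcp.prompt()
def placeholder_prompt() -> str:
    """Placeholder prompt, registered only to widen the capability set."""
    return "Placeholder prompt"

@mcp.resource("resource://placeholder")
def placeholder_resource() -> str:
    """Placeholder resource, registered only to widen the capability set."""
    return "Placeholder resource"

If Copilot Studio then proceeds to tools/list, the missing-capabilities theory holds; if it still stops after initialize, the cause is likely elsewhere (for example in the transport or session handling).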