Is the MCP server compatible with the original FastAPI streaming response pattern? If my FastAPI endpoints return streaming responses, will the MCP tools also maintain this streaming behavior when called by AI assistants?
We didn't test that, but my assumption is that currently it won't work.
The SSE transport in the current MCP Python SDK isn't meant for token streaming; it's a design choice by the protocol's creators to have JSON-RPC messages ride on top of SSE.
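In other words, a tool call has to produce one complete JSON-RPC result message, so a streamed endpoint body would be drained in full before anything reaches the client. A rough sketch of that buffering effect (the function names here are hypothetical, not real SDK APIs):

```python
import asyncio
import json

async def token_stream():
    # Stand-in for the body of a FastAPI StreamingResponse.
    for token in ["Hello", " ", "world"]:
        yield token

async def call_tool_buffered():
    # Hypothetical MCP-style tool call: the entire stream is collected
    # first, because the result travels as a single JSON-RPC message.
    body = "".join([chunk async for chunk in token_stream()])
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "result": {"content": [{"type": "text", "text": body}]},
    })

message = asyncio.run(call_tool_buffered())
```

So even if the underlying endpoint streams, the AI assistant would only see the final aggregated text in one response.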
Actual "streamable" responses would be inherently supported when either: