FastAPI streaming response #77


Open

wfcz10086 opened this issue Apr 18, 2025 · 2 comments

@wfcz10086

Is the MCP server compatible with the original FastAPI streaming response pattern? If my FastAPI endpoints return streaming responses, will the MCP tools also maintain this streaming behavior when called by AI assistants?

@shahar4499
Contributor

shahar4499 commented Apr 18, 2025

We haven't tested that, but my assumption is that it currently won't work.

The SSE transport in the current MCP Python SDK isn't meant for token streaming; it's simply the protocol creators' design choice for carrying JSON-RPC messages over SSE.
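To illustrate the point above: in the MCP SSE transport, each SSE event carries one complete JSON-RPC message as its data payload, rather than a fragment of a token stream. A sketch of the wire framing (the method name and payload here are made up for illustration):

```python
import json

# One complete JSON-RPC message per SSE event -- the event is not
# emitted until the whole message is available, so partial tokens
# from a tool's response have no natural place in this framing.
message = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "my_tool", "arguments": {}},
}

sse_event = f"event: message\ndata: {json.dumps(message)}\n\n"
```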

Actual "streamable" responses would be inherently supported when either:

But I believe we'd implement it first, right after we're done with #10.

@Dzg0309
Copy link

Dzg0309 commented Apr 22, 2025

Is there a specific implementation timeline for this? I really need this feature. Thank you.
