LM studio fails with no response #63


Open
jvsteiner opened this issue Mar 22, 2025 · 1 comment

Comments

@jvsteiner

I'm using the 5ire MCP client. I see this in the logs on the LM Studio side:

[LM STUDIO SERVER] Running chat completion on conversation with 7 messages.
2025-03-22 02:55:23  [INFO] [LM STUDIO SERVER] Streaming response...
2025-03-22 02:55:27  [INFO] [LM STUDIO SERVER] First token generated. Continuing to stream response..
2025-03-22 02:55:30  [INFO] Finished streaming response
2025-03-22 02:55:30  [INFO] [LM STUDIO SERVER] Client disconnected. Stopping generation... (If the model is busy processing the prompt, it will finish first.)

I used the lms CLI to watch the context, and I can see the tool instructions being injected. I think something may be wrong with the streaming support.
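
To help isolate whether streaming itself is at fault, here is a minimal sketch (not from the original thread) that talks to LM Studio's OpenAI-compatible endpoint directly, bypassing the bridge and the 5ire client. The base URL, API key placeholder, and model name are assumptions and may need adjusting for your local setup.

```python
# Minimal sketch: stream a completion straight from LM Studio's
# OpenAI-compatible server, with no MCP bridge in between.
# Assumes LM Studio is serving on its default address
# (http://localhost:1234/v1); adjust base_url and the model name as needed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

stream = client.chat.completions.create(
    model="local-model",  # placeholder; use the model identifier shown in LM Studio
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)

# Print tokens as they arrive; if this completes cleanly, streaming on the
# LM Studio side works, and the disconnect is likely happening between the
# bridge and the client.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```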

@SecretiveShell (Owner)

If you are using 5ire, there should be no need to use MCP bridge, since 5ire has native MCP support.

Is there any reason you specifically need the bridge in this situation?
