
Misc. bug: llama-server does not read the "--keep" param the user passes on the CLI #12927


Open
ZUIcat opened this issue Apr 13, 2025 · 0 comments

Comments


ZUIcat commented Apr 13, 2025

Name and Version

version b5124

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

llama-server

Command line

"llama-server.exe" ^
-m test.gguf ^
--host 127.0.0.1 --port 8090 ^
--keep 0

Problem description & steps to reproduce

llama-server does not read the "--keep" param that the user passes on the CLI.
I have read the code. Here, https://github.com/ggml-org/llama.cpp/blob/bc091a4dc585af25c438c8473285a8cfec5c7695/examples/server/server.cpp#L242
params.n_keep = json_value(data, "n_keep", defaults.n_keep);
should be
params.n_keep = json_value(data, "n_keep", params_base.n_keep);
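For anyone reading along, here is a minimal, self-contained sketch of the failure mode. It is not the real server.cpp: json_value, common_params, and slot_params are simplified stand-ins, and --keep 64 is a hypothetical value chosen so the two fallbacks visibly differ. It only illustrates why falling back to defaults.n_keep (a default-constructed per-request struct) discards whatever was passed on the CLI, while falling back to params_base.n_keep preserves it:

#include <cstdint>
#include <iostream>
#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// json_value returns data[key] when the key is present, otherwise the fallback.
template <typename T>
static T json_value(const json & data, const std::string & key, const T & fallback) {
    return data.contains(key) ? data.at(key).get<T>() : fallback;
}

struct common_params { int32_t n_keep = 0; }; // filled from the CLI (--keep)
struct slot_params   { int32_t n_keep = 0; }; // per-request parameters

int main() {
    common_params params_base;
    params_base.n_keep = 64;        // hypothetical: user ran with --keep 64

    slot_params defaults;           // default-constructed; never sees the CLI value
    slot_params params;

    json data = json::object();     // request body with no "n_keep" field

    // Buggy fallback: uses the default-constructed value, so --keep is ignored.
    params.n_keep = json_value(data, "n_keep", defaults.n_keep);
    std::cout << "buggy:    n_keep = " << params.n_keep << "\n"; // prints 0

    // Proposed fix: fall back to the CLI-derived value instead.
    params.n_keep = json_value(data, "n_keep", params_base.n_keep);
    std::cout << "proposed: n_keep = " << params.n_keep << "\n"; // prints 64
}

When the request body does supply "n_keep", both variants behave the same; the difference only shows up when the client omits the field and the server must choose a default.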

First Bad Commit

No response

Relevant log output
