@cyyeh could you reply to this one? Thanks.
@Marsedward please set WREN_AI_SERVICE_VERSION=0.19.3 in ~/.wrenai/.env and use this config example: https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.lm_studio.yaml
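The env change in the reply above can be sketched as a shell snippet (the path and variable name come from the reply; run this on the machine hosting WrenAI):

```shell
# Pin the service version in ~/.wrenai/.env, creating the directory if needed.
mkdir -p ~/.wrenai
# Append the variable only if it is not already present.
grep -q '^WREN_AI_SERVICE_VERSION=' ~/.wrenai/.env 2>/dev/null \
  || echo 'WREN_AI_SERVICE_VERSION=0.19.3' >> ~/.wrenai/.env
```

Restart the WrenAI containers afterwards so the new version takes effect.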
1. When integrating with lm_studio, do I need to modify the .env file, and if so, how?

2. How should the config file be set up?
My config is as follows (many thanks in advance):
type: llm
provider: litellm_llm
timeout: 120
models:
  - api_base: http://172.24.12.23:1234
    kwargs:
      n: 1
      seed: 0
      max_completion_tokens: 4096
      reasoning_effort: low
---
type: embedder
provider: litellm_embedder
models:
  - alias: default
    api_base: http://172.24.12.23:1234
timeout: 120
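For comparison, a hedged sketch of what a working litellm config for LM Studio might look like. The `lm_studio/` model prefix, the `/v1` suffix on `api_base`, and the embedding model name are assumptions; verify every field against the official config.lm_studio.yaml example linked in the reply below.

```yaml
# Sketch only -- field names and model prefixes are assumptions, not the
# confirmed WrenAI schema; check config.lm_studio.yaml in the WrenAI repo.
type: llm
provider: litellm_llm
timeout: 120
models:
  - model: lm_studio/deepseek-coder-v2-lite-instruct-mlx  # "lm_studio/" prefix assumed
    alias: default
    api_base: http://172.24.12.23:1234/v1  # LM Studio's OpenAI-compatible API is served under /v1
    kwargs:
      n: 1
      seed: 0
      max_completion_tokens: 4096
---
type: embedder
provider: litellm_embedder
timeout: 120
models:
  - model: lm_studio/text-embedding-nomic-embed-text-v1.5  # hypothetical embedding model name
    alias: default
    api_base: http://172.24.12.23:1234/v1
```

Note that each list entry under `models:` needs a `model:` key; the config pasted above omits it, which alone can cause the provider to fail health checks.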
lm_studio model: deepseek-coder-v2-lite-instruct-mlx
Local IP: http://172.24.12.23:1234
Deployed via OrbStack.
There is a persistent red (error) light, as shown below:
Thanks in advance for any help. Could the author also make model integration simpler?