
How to use the DeepSeek API with JetBrains (not a local model; using the official DeepSeek model API) #4099

Closed
yjqiang opened this issue Apr 1, 2025 · 2 comments

Comments


yjqiang commented Apr 1, 2025

Here is what I did:

  • Installed Tabby via Homebrew
  • Edited the config at ~/.tabby/config.toml:
[model.chat.http]
kind = "openai/chat"
model_name = "deepseek-reasoner"
api_endpoint = "https://api.deepseek.com/v1"
api_key = "xxx"

[model.completion.http]
kind = "deepseek/completion"
model_name = "deepseek-reasoner"
api_endpoint = "https://api.deepseek.com/v1"
api_key = "xxx"
  • Ran tabby serve --port 8080
    It failed to start, with the following output:
Writing to new file.
File exists. Resuming.
File exists. Resuming.
The application panicked (crashed).
Message:  Failed to fetch model 'Nomic-Embed-Text' due to 'Fetching 'https://huggingface.co/nomic-ai/nomic-embed-text-v1.5-GGUF/resolve/main/nomic-embed-text-v1.5.Q8_0.gguf' failed: Server returned error sending request for url (https://huggingface.co/nomic-ai/nomic-embed-text-v1.5-GGUF/resolve/main/nomic-embed-text-v1.5.Q8_0.gguf) HTTP status'
Location: /Users/runner/work/tabby/tabby/crates/tabby-download/src/lib.rs:210

Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets
yjqiang changed the title from “How to use the DeepSeek API with JetBrains (not a local model; using the DeepSeek model)” to “How to use the DeepSeek API with JetBrains (not a local model; using the official DeepSeek model API)” Apr 1, 2025

5bug commented Apr 1, 2025

The problem is that downloading the embedding model Nomic-Embed-Text failed. The default embedding model is nomic-embed-text; you can switch to a remote model instead, as described in this doc: https://tabby.tabbyml.com/docs/references/models-http-api/llama.cpp/
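For reference, a minimal sketch of the remote-embedding setup the linked doc describes: run an embedding model behind a llama.cpp server and point Tabby at it. This assumes a llama.cpp build that provides the llama-server binary with an --embedding flag, and a locally downloaded nomic-embed-text-v1.5.Q8_0.gguf file (the same file the failed download was fetching); neither step is shown in this thread.

# Serve the embedding model locally over HTTP on port 8888 (assumed flags; check your llama.cpp version)
llama-server -m nomic-embed-text-v1.5.Q8_0.gguf --embedding --port 8888

Tabby is then pointed at http://localhost:8888 via a [model.embedding.http] section in config.toml, as in the reply below.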


yjqiang commented Apr 1, 2025

The problem is that downloading the embedding model Nomic-Embed-Text failed. The default embedding model is nomic-embed-text; you can switch to a remote model instead, as described in this doc: https://tabby.tabbyml.com/docs/references/models-http-api/llama.cpp/

Thanks, that did it. I just added a placeholder embedding section (I shouldn't need embeddings anyway) and it works:

[model.embedding.http]
kind = "llama.cpp/embedding"
api_endpoint = "http://localhost:8888"

TabbyML locked and limited conversation to collaborators Apr 1, 2025
wsxiaoys converted this issue into discussion #4100 Apr 1, 2025

