I installed the Visual Studio Code extension, but I'm not sure whether I can connect to a remote LLM via API key. I can see it has all sorts of options for running local models, but what if you don't want that?

My Tabby agent configuration file so far:

```toml
[server]
token = "token" # if set, the request header Authorization = "Bearer $token" will be added

[model.chat.http]
```
Answered by
wsxiaoys
Mar 17, 2025
Please check https://tabby.tabbyml.com/docs/administration/model/ and https://tabby.tabbyml.com/docs/references/models-http-api/mistral-ai/ for configuration through the HTTP API.
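Following those docs, a remote chat backend is configured under `[model.chat.http]` in Tabby's config file. The sketch below is a hedged example only: the `kind` value, model name, endpoint, and key are placeholders taken from the general pattern in the linked docs, not values confirmed in this thread — check the models-http-api reference for the exact `kind` strings your provider needs.

```toml
# Sketch: connecting Tabby's chat feature to a remote, OpenAI-compatible
# API instead of a local model. All values below are placeholders.
[model.chat.http]
kind = "openai/chat"                        # backend kind; provider-specific, see docs
model_name = "mistral-small-latest"         # placeholder model name
api_endpoint = "https://api.mistral.ai/v1"  # placeholder remote endpoint
api_key = "your-secret-key"                 # placeholder API key
```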
Answer selected by
wsxiaoys