I'm trying to set up Tabby with Ollama. Completions work, but the chat panel in IntelliJ permanently shows "loading chat panel", and the chat panel in VSCodium shows the same briefly and then says "failed to load the chat panel". There is nothing relevant in the extension log, even at trace debug level.
IntelliJ:
VSCodium:
The http://localhost:11029/chat URL is accessible and always shows this; there are no JS errors or request failures:
Hi @pshirshov, could you please check the versions of the Tabby server, IntelliJ Plugin, and VSCode Extension you are using?
This issue may be caused by using an outdated plugin version with a newer server version. Please try updating to the latest versions and see if it resolves the problem.
The latest versions are:
Tabby server config: I tried both
api_endpoint = "http://localhost:11434/v1"
and
api_endpoint = "http://localhost:11434"
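For reference, a minimal sketch of what the chat-model section of the Tabby `config.toml` might look like when pointing at Ollama's OpenAI-compatible endpoint. This follows Tabby's HTTP model configuration format; the model name is a placeholder assumption, not something confirmed in this thread:

```toml
# Sketch only: the chat backend is configured via Tabby's HTTP model config.
# "llama3.1" is a placeholder; use whatever model is pulled in Ollama.
[model.chat.http]
kind = "openai/chat"
model_name = "llama3.1"
api_endpoint = "http://localhost:11434/v1"
```

Note that the OpenAI-compatible chat kind generally expects the `/v1` suffix on the endpoint, whereas Ollama-native kinds take the bare base URL, which may explain why only one of the two endpoints tried above would work.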