Windows Subsystem for Linux Support #1944
Comments
WSL should work seamlessly with Docker support. We have also provided a Windows binary for native Windows support. Would these options be sufficient for your use case?
I've just discovered your software and I'm testing it out with your Docker example, but it's kind of slow with Docker and a CPU-only model. That's why I was wondering whether WSL support would be better.
Please refer to https://tabby.tabbyml.com/docs/installation/windows/ for the Windows installation guide.
I installed it on my Windows machine with WSL2. I used the Docker approach and it looks very good, thanks! The Docker installation supports my Nvidia card.
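For anyone else trying the same setup, here is a minimal sketch of the Docker route under WSL2 with an NVIDIA GPU. It assumes the NVIDIA driver and the NVIDIA Container Toolkit are already working inside WSL2, and the model name shown is only an example; check the current docs for the recommended models.

```bash
# Start the Tabby server with GPU access; ~/.tabby persists downloaded models and config.
docker run -it --gpus all \
  -p 8080:8080 \
  -v $HOME/.tabby:/data \
  tabbyml/tabby serve --model StarCoder-1B --device cuda
```

Once the container is up, the server should be reachable at http://localhost:8080.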
For those interested in getting Tabby up and running on Windows with WSL: TL;DR, the best way is to set up an Ubuntu distro and follow the Linux installation instructions. I wrote an article with the details and some examples and tips --> A Fast and Private AI-Coding Assistant. How-To-Guide in 3 simple steps.
In addition, I have to say that I tried many Tabby flavors, and this approach is the best way to fully use my GPU. (Indeed, I couldn't make it work properly on Windows and had a lot of difficulties with the container one.) With this approach I fully own my on-device AI, which is what I was looking for when I first found Tabby ML. IMHO this issue could be closed.
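For anyone following the WSL route described in the comment above, here is a rough sketch of the steps, assuming an NVIDIA GPU. The download step and CLI flags are illustrative; check the Tabby releases page and the Linux installation docs for the exact asset names and options.

```bash
# In an elevated PowerShell on Windows: install WSL2 with an Ubuntu distro.
wsl --install -d Ubuntu

# Inside the Ubuntu shell: download a Linux build of Tabby from the releases page
# (https://github.com/TabbyML/tabby/releases; asset names vary by release),
# make it executable, and start the server.
chmod +x tabby
./tabby serve --model StarCoder-1B --device cuda   # flags shown for illustration only
```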
Please describe the feature you want
Are there any plans to support installation on WSL, please?
Additional context
Add any other context or screenshots about the feature request here.
Please reply with a 👍 if you want this feature.