
Implement AI integration using large language models (LLMs) #2730

Open
asdfsx opened this issue Mar 12, 2025 · 6 comments
Assignees: kingToolbox
Labels: feature New feature


asdfsx commented Mar 12, 2025

I suggest adding a chat dialog for large language models,
connecting to different LLMs by configuring an OpenAI-compatible API.
In the future, you could consider adding AI agent capabilities, like Warp does.
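
For illustration, a minimal sketch of what an OpenAI-compatible connection could look like - the base URL, model name, and LLM_API_KEY environment variable are placeholders, not anything WindTerm ships:

```python
import json
import os
import urllib.request

def chat(base_url: str, model: str, prompt: str) -> str:
    """POST one message to any OpenAI-compatible /v1/chat/completions endpoint."""
    request = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['LLM_API_KEY']}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["choices"][0]["message"]["content"]

# Switching providers is just a different base_url/model pair, e.g.:
#   chat("https://api.openai.com", "gpt-4o-mini", "Explain tar -xzvf")
#   chat("http://localhost:11434", "llama3", "Explain tar -xzvf")  # local Ollama
```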


m00nLi commented Mar 13, 2025

Not necessary.


ViCrack commented Mar 13, 2025

Everyone wants to cram AI into everything these days. It doesn't feel all that necessary yet, and many terminal features still haven't been implemented.

@aflyingnoob

You type r and it autocompletes rm -rf for you.

@kingToolbox kingToolbox changed the title from "Suggest adding support for large language models" to "Add support for large language models" Mar 14, 2025
@kingToolbox kingToolbox self-assigned this Mar 14, 2025
@kingToolbox kingToolbox added the feature New feature label Mar 14, 2025
kingToolbox (Owner) commented Mar 14, 2025

This is an interesting topic. How AI and terminals can be better integrated needs further exploration. Some terminal software has already made attempts, and this is a key feature planned for WindTerm 3.x.

First, the existing interface and core need a complete redesign and rewrite; they're not well-suited for AI integration. Fortunately, a new interface was designed long before AI became popular. It will be developed from scratch and launched in the new WindTerm 3.x version.

Second, it's no exaggeration to say that beyond the features already developed, WindTerm has long had at least a thousand more ideas and designs waiting to be implemented. Many of these ideas are full of imagination, and before AI, I didn't even know how to code them. The emergence of AI has finally provided the long-awaited solution. WindTerm 3.x will therefore have many new features never seen before, far beyond just adding AI question answering or AI autocomplete predictions. I've been waiting for them for years. But I'm sorry, I can't describe them in detail right now.

Of course, AI also brings new problems: privacy and cost. Local AI is the best solution, but not everyone has the hardware to run their own model. Maintaining WindTerm's own online AI service, with end-to-end encryption and without storing any questions on the server, could protect privacy well, but it costs a lot of money, which I can't afford alone. Allowing users to bring their own API key avoids new costs, but privacy is still not guaranteed. These are trade-offs every AI user will face. I'll pin this issue and listen to your feedback.

Of course, I'll provide a switch to completely disable AI for users who don't want it. However, integrating AI into WindTerm is an inevitable future.

One day, machines will manage themselves and no terminal, including WindTerm, will be needed anymore. But until that day, we must first learn to coexist with them.

@kingToolbox kingToolbox pinned this issue Mar 14, 2025
@kingToolbox kingToolbox changed the title from "Add support for large language models" to "Implement AI integration using large language models (LLMs)" Mar 14, 2025
asdfsx (Author) commented Mar 14, 2025

Waiting for the latest version!


C-Biao commented Mar 20, 2025

AI is helpful, in my opinion, in scenarios where I don't remember the arguments, or even the command, to use. It's really hard to instantly come up with a correct command line with the proper arguments and parameters for every command and CLI tool - I have to look things up on Google or a cheat sheet from time to time, so why not AI instead? I hope AI can help me greatly in this matter - as long as it doesn't recommend "rm -rf" :(
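
For illustration, a toy sketch of that lookup flow - every name here is hypothetical and the denylist is deliberately naive - with a crude guard against exactly the "rm -rf" case:

```python
import re

# Naive denylist - a real integration would need something far more careful.
DANGEROUS = [
    re.compile(r"\brm\s+-[a-zA-Z]*(rf|fr)\b"),
    re.compile(r"\bmkfs(\.\w+)?\b"),
    re.compile(r"\bdd\s+.*\bof=/dev/"),
]

def ask_model(question: str) -> str:
    # Placeholder for a real LLM call (e.g. the chat() sketch above).
    return "tar -xzvf archive.tar.gz"

def suggest_command(question: str) -> str:
    command = ask_model(
        f"Reply with a single POSIX shell command, no prose: {question}"
    ).strip()
    if any(pattern.search(command) for pattern in DANGEROUS):
        return f"# blocked for manual review: {command}"
    return command

print(suggest_command("extract a gzipped tarball"))  # -> tar -xzvf archive.tar.gz
```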

As for privacy and (cost) efficiency, it would be helpful to have a choice of AI endpoints (maybe via a gateway like OpenRouter) and to allow self-hosted AI, and additionally to work with MCP for open-ended possibilities - alongside risks the user should understand and accept. Integrating with today's AI is like FSD on a Tesla - AI helps, but the driver takes ultimate responsibility. Remember, it's still L2 self-driving today :D

Two ways to integrate AI:

  • Have AI monitor the entire session window and respond whenever your input starts with a special prefix, maybe @claude or @myai. The AI will have a better understanding of the context of what you are doing and trying to do (a rough sketch of this approach follows below).
  • Or have a separate AI window that picks up your questions and responds; the AI would have no visibility into the session window until you explicitly allow it. This way, in most cases, you would manually copy/paste content of your choosing between the two windows.

I personally prefer the 2nd. But maybe there are more ways to use AI?
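
Here's a rough sketch of the first approach - a hypothetical prefix-routing layer, where none of these names are real WindTerm APIs and the AI and shell calls are stubbed out:

```python
from collections import deque

AI_PREFIXES = ("@claude", "@myai")  # user-configurable triggers
context = deque(maxlen=200)         # rolling window of recent session lines

def ask_ai(question: str, session_context: list) -> str:
    # Placeholder for a real LLM call that receives the session context.
    return f"[AI] ({len(session_context)} context lines) {question}"

def run_in_shell(line: str) -> str:
    # Placeholder for the normal terminal path.
    return f"$ {line}"

def handle_input(line: str) -> str:
    for prefix in AI_PREFIXES:
        if line.startswith(prefix):
            question = line[len(prefix):].strip()
            return ask_ai(question, list(context))
    context.append(line)            # only real commands feed the context
    return run_in_shell(line)

print(handle_input("ls -la"))
print(handle_input("@myai why did the last command fail?"))
```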

Any thoughts?
