Nate/vllm-no-fim #5833
Conversation
Nice tests 👌 Just one small comment, but the logic in that file is all similarly hard to follow, so that's unrelated to this particular PR.
core/llm/toolSupport.ts
export const PROVIDER_TOOL_SUPPORT: Record<string, (model: string) => boolean> =
  {
    "continue-proxy": (model) => {
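The excerpt above is only the opening of the record the reviewer found hard to follow. As a minimal sketch of how such a provider-to-predicate lookup works, here is a self-contained version in the same shape; the model-name patterns and the `providerSupportsTools` helper are illustrative assumptions, not Continue's actual rules:

```typescript
// Hypothetical sketch of a provider-capability lookup in the shape the
// excerpt suggests. The matching rules below are invented for illustration.
const PROVIDER_TOOL_SUPPORT: Record<string, (model: string) => boolean> = {
  "continue-proxy": (model) => {
    // Proxy model IDs may embed the underlying model name, so match on
    // substrings (assumption for illustration only).
    const lower = model.toLowerCase();
    return lower.includes("claude") || lower.includes("gpt-4");
  },
  openai: (model) => model.toLowerCase().startsWith("gpt-4"),
};

// Hypothetical helper: unknown providers default to "no tool support".
function providerSupportsTools(provider: string, model: string): boolean {
  const check = PROVIDER_TOOL_SUPPORT[provider];
  return check ? check(model) : false;
}

console.log(providerSupportsTools("openai", "gpt-4o")); // true
console.log(providerSupportsTools("vllm", "anything")); // false
```

Flattening the per-provider branching into a single record keyed by provider name is one way to make this kind of logic easier to follow, since each provider's rule is isolated in its own closure.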
All this logic is quite difficult to follow
Description
vLLM doesn't support FIM (fill-in-the-middle), so FIM requests shouldn't be sent to it, including through the proxy.
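A minimal sketch of the gating this description implies, assuming a `supportsFim` predicate keyed by provider (the provider list and the proxy-delegation rule are assumptions for illustration, not Continue's implementation):

```typescript
// Sketch: vLLM exposes no FIM endpoint, so it must report no FIM support
// even when requests are routed through continue-proxy.
const FIM_CAPABLE_PROVIDERS = new Set(["mistral", "codestral"]); // illustrative list

function supportsFim(provider: string, underlyingProvider?: string): boolean {
  // Assumption: a proxy defers to the provider it forwards to.
  if (provider === "continue-proxy") {
    return underlyingProvider ? supportsFim(underlyingProvider) : false;
  }
  return FIM_CAPABLE_PROVIDERS.has(provider);
}

console.log(supportsFim("vllm")); // false
console.log(supportsFim("continue-proxy", "vllm")); // false
console.log(supportsFim("mistral")); // true
```

The key property is the middle case: routing through the proxy does not grant FIM support that the underlying provider lacks.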
Checklist
Tests
A test suite was added that checks the supportsFim method across many ILLM classes.
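The test suite described above likely takes a table-driven shape: instantiate each LLM class and assert its `supportsFim()` result. A hedged sketch, with class names and expectations invented for illustration (they are not copied from the repo):

```typescript
// Illustrative stand-ins for ILLM implementations (names assumed).
class VllmLLM {
  supportsFim(): boolean {
    return false; // vLLM exposes no FIM endpoint
  }
}

class MistralLLM {
  supportsFim(): boolean {
    return true; // assumed FIM-capable for this sketch
  }
}

// Table-driven check: one [instance, expected, label] row per class.
const expectations: Array<[{ supportsFim(): boolean }, boolean, string]> = [
  [new VllmLLM(), false, "vllm"],
  [new MistralLLM(), true, "mistral"],
];

for (const [llm, expected, name] of expectations) {
  if (llm.supportsFim() !== expected) {
    throw new Error(`${name}: expected supportsFim() === ${expected}`);
  }
}
console.log("all supportsFim expectations passed");
```

Keeping the expected values in a single table makes it cheap to add a row whenever a new provider class is introduced.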