fix: resolve OpenAI backend compatibility issues and build failures #2186
Conversation
This commit addresses multiple critical issues preventing proper OpenAI integration:

**Build Issues Fixed:**
- Update `github.com/ebitengine/purego` from v0.8.2 to v0.8.4
- Resolves duplicate symbol linker errors (`dlopen` conflicts) on Intel macOS
- Enables successful compilation of the wavesrv backend server

**API Compatibility Improvements:**
- Filter out "error" role messages in the `convertPrompt()` function
- Prevents "400 Bad Request: user and assistant roles should be alternating" errors
- Ensures a clean message flow to OpenAI API endpoints

**Enhanced Model Support:**
- Extend o1-model handling to include newer model families
- Add support for the gpt-4.1+, o4+, and o3+ model series
- Use the `max_completion_tokens` parameter for reasoning models instead of `max_tokens`
- Maintain backward compatibility with existing model configurations

**Technical Details:**
- Error role filtering prevents API rejection due to invalid role types
- Non-streaming API usage for reasoning models improves response quality
- Dependency update resolves CGO compilation conflicts on multiple architectures

**Testing:**
- Verified successful wavesrv compilation on darwin/x64
- Confirmed OpenAI API calls complete without 400/401 errors
- Tested with multiple model configurations (gpt-4o, gpt-4o-mini)

Fixes build failures and API integration issues reported in the development environment.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
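The role filtering can be illustrated with a minimal sketch. The types below (`PromptMessage`, `ChatMessage`) are simplified stand-ins for WaveTerm's prompt and OpenAI client message types, not the actual definitions in `pkg/waveai/openaibackend.go`:

```go
package waveai

// PromptMessage is a stand-in for WaveTerm's prompt message type
// (hypothetical; the real struct lives in the waveai/wshrpc packages).
type PromptMessage struct {
	Role    string // "system", "user", "assistant", or "error"
	Content string
}

// ChatMessage is a stand-in for the OpenAI client's chat message type.
type ChatMessage struct {
	Role    string
	Content string
}

// convertPrompt drops "error" role messages so the outgoing request only
// contains roles the OpenAI API accepts; locally injected error entries
// would otherwise break the user/assistant alternation and trigger
// 400 Bad Request responses.
func convertPrompt(prompt []PromptMessage) []ChatMessage {
	msgs := make([]ChatMessage, 0, len(prompt))
	for _, p := range prompt {
		if p.Role == "error" {
			continue // skip rather than send an invalid role
		}
		msgs = append(msgs, ChatMessage{Role: p.Role, Content: p.Content})
	}
	return msgs
}
```

Dropping the entries outright, rather than remapping them to another role, keeps the user/assistant alternation the API expects.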
Force-pushed from `9a44c9c` to `bb77b90`.
Summary
This PR resolves multiple critical issues preventing proper OpenAI integration in WaveTerm, including build failures on Intel macOS and API compatibility problems.
Issues Fixed
🔧 Build Failures
- Duplicate symbol linker errors (`dlopen` conflicts) when building wavesrv on Intel macOS
- Fixed by bumping the `github.com/ebitengine/purego` dependency from v0.8.2 to v0.8.4

🚫 API Compatibility
- Filter out "error" role messages in the `convertPrompt()` function, which otherwise break the user/assistant role alternation the API requires (see the sketch under Code Changes below)

🤖 Enhanced Model Support
- Extend the existing o1-model handling to the gpt-4.1+, o4+, and o3+ model series
- Use the `max_completion_tokens` parameter instead of `max_tokens` for these models (see the sketch under Code Changes below)

Technical Changes
Files Modified
- `go.mod` / `go.sum`: Dependency version bump
- `pkg/waveai/openaibackend.go`: Core compatibility fixes

Code Changes
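A hedged sketch of the model handling described above. The helper name, prefix list, and request struct are hypothetical stand-ins for illustration; the real change sets the equivalent fields on the OpenAI client's chat completion request in `pkg/waveai/openaibackend.go`:

```go
package waveai

import "strings"

// usesMaxCompletionTokens reports whether a model name belongs to a family
// that should receive max_completion_tokens instead of max_tokens. The
// prefix list is illustrative, matching the families named in this PR
// (o1, o3+, o4+, gpt-4.1+).
func usesMaxCompletionTokens(model string) bool {
	prefixes := []string{"o1", "o3", "o4", "gpt-4.1"}
	for _, p := range prefixes {
		if strings.HasPrefix(model, p) {
			return true
		}
	}
	return false
}

// ChatRequest is a simplified stand-in for the OpenAI chat completion
// request type (hypothetical field set).
type ChatRequest struct {
	Model               string
	MaxTokens           int
	MaxCompletionTokens int
}

// buildRequest picks the token-limit parameter appropriate for the model
// family, keeping max_tokens for older models for backward compatibility.
func buildRequest(model string, maxTokens int) ChatRequest {
	req := ChatRequest{Model: model}
	if usesMaxCompletionTokens(model) {
		req.MaxCompletionTokens = maxTokens // newer models reject max_tokens
	} else {
		req.MaxTokens = maxTokens
	}
	return req
}
```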
Test Plan
Compatibility
Related Issues
Addresses build failures and API integration problems reported in development environments using OpenAI backend.
🤖 Generated with Claude Code