The default provider is xAI, right? How do I configure Azure OpenAI instead of xAI?
I want to switch from xAI/Groq to my own FastAPI backend (a LangGraph API reachable at an IP address or URL). Can I do this through `lib/ai/providers.ts`? How?
To change providers, have a read through the AI SDK documentation on provider management, which explains how to combine custom providers, the provider registry, and middleware: https://sdk.vercel.ai/docs/ai-sdk-core/provider-management#combining-custom-providers-provider-registry-and-middleware
You will then need to update this file in the ai-chatbot template:
ai-chatbot/lib/ai/providers.ts (line 16 at commit 235b0ed)
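As a rough sketch of what that edit could look like, here is a hypothetical `providers.ts` switched to Azure OpenAI via the `@ai-sdk/azure` package. The resource name, deployment name, and model-id keys are assumptions; substitute your own Azure deployment and whatever model ids the template actually registers:

```typescript
// Hypothetical sketch of lib/ai/providers.ts using Azure OpenAI.
// Assumes @ai-sdk/azure is installed; "my-azure-resource" and
// "my-gpt4o-deployment" are placeholders for your own Azure setup.
import { customProvider } from 'ai';
import { createAzure } from '@ai-sdk/azure';

const azure = createAzure({
  resourceName: 'my-azure-resource',      // assumption: your Azure OpenAI resource name
  apiKey: process.env.AZURE_API_KEY,      // assumption: key supplied via env var
});

export const myProvider = customProvider({
  languageModels: {
    // Map the template's model ids to your Azure deployments.
    'chat-model': azure('my-gpt4o-deployment'),
    'title-model': azure('my-gpt4o-deployment'),
  },
});
```

The `customProvider` wrapper keeps the rest of the template untouched: code that asks for `'chat-model'` by id resolves to the Azure deployment instead of the xAI default.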
For using FastAPI as your backend, you can use this as a reference: https://github.com/vercel-labs/ai-sdk-preview-python-streaming. Linking the other issue you've created for other users who might be looking to use FastAPI: #751
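If your FastAPI backend happens to expose OpenAI-compatible endpoints, one common approach (not the only one, and not what the Python streaming preview above does) is to point an OpenAI provider instance at it via `baseURL`. The URL and model id below are placeholders:

```typescript
// Sketch: pointing the AI SDK at an OpenAI-compatible FastAPI backend.
// Assumes your FastAPI service serves OpenAI-compatible routes under /v1;
// "http://localhost:8000/v1" and "my-langgraph-model" are placeholders.
import { customProvider } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

const fastapiBackend = createOpenAI({
  baseURL: 'http://localhost:8000/v1',   // assumption: your FastAPI server address
  apiKey: 'not-needed-for-local-backend',
});

export const myProvider = customProvider({
  languageModels: {
    'chat-model': fastapiBackend('my-langgraph-model'),
  },
});
```

If your backend instead streams with its own protocol (as in the linked Python streaming preview), follow that repository's pattern rather than this shortcut.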