Add OVHcloud as an inference provider #1303
Conversation
Hi @fabienric, we are currently finishing a refactoring of the Inference Providers integration code in #1315. It should be merged soon, but we will need to rewrite part of your implementation (it should be even simpler to integrate afterwards). We will ping you again once it's been merged.
Hi @fabienric,

1 - Register the provider in the `PROVIDERS` mapping:

```ts
import * as OvhCloud from "../providers/ovhcloud";
...
export const PROVIDERS: Record<InferenceProvider, Partial<Record<InferenceTask, TaskProviderHelper>>> = {
	...
	"ovhcloud": {
		"conversational": new OvhCloud.OvhCloudConversationalTask(),
	},
	...
};
```

2 - Update the provider module:

```ts
import { BaseConversationalTask, BaseTextGenerationTask } from "./providerHelper";

export class OvhCloudConversationalTask extends BaseConversationalTask {
	constructor() {
		super("ovhcloud", "https://oai.endpoints.kepler.ai.cloud.ovh.net");
	}
}
```

and that's it :) Let us know if you need any help! You can find more details in the documentation: https://huggingface.co/docs/inference-providers/register-as-a-provider#2-js-client-integration.
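For context, once registered like this the provider becomes selectable from the JS client. A minimal usage sketch, assuming the `@huggingface/inference` package exposes a `chatCompletion` helper that accepts a `provider` option (the model id below is a placeholder, not taken from this PR):

```ts
import { chatCompletion } from "@huggingface/inference";

// Sketch only: model id and token handling are illustrative assumptions.
const output = await chatCompletion({
	accessToken: process.env.HF_TOKEN,
	provider: "ovhcloud",
	model: "<some-chat-model>",
	messages: [{ role: "user", content: "Hello from OVHcloud!" }],
});

console.log(output.choices[0].message);
```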
(sorry for the moving parts @fabienric – we can help move this PR over the finish line if needed)
- ovhcloud inference provider: use new base tasks and provider helpers, fix issues with inference parameters, add support for text generation task
Hi @hanouticelina and @julien-c, thank you for the feedback, the refactoring and the updated documentation. I've implemented our provider; it required more work than I expected to get the payload right for an OpenAI-compatible endpoint (make sure that the …). I've also implemented the text generation task, but I've found that the streaming case is not covered by the base task (the …). Available to discuss the matter further if required.
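To make the payload concern concrete, here is a minimal sketch (not the code in this PR) of mapping Hugging Face style text-generation arguments onto an OpenAI-compatible completions payload; the field names on the input side are assumptions for illustration:

```ts
// Sketch only: illustrates why parameter names need explicit mapping when
// targeting an OpenAI-compatible endpoint. Not the actual PR implementation.
interface TextGenerationArgs {
	inputs: string;
	parameters?: {
		max_new_tokens?: number;
		temperature?: number;
		top_p?: number;
		stop?: string[];
	};
}

function toOpenAICompletionPayload(model: string, args: TextGenerationArgs): Record<string, unknown> {
	const { parameters = {} } = args;
	return {
		model,
		prompt: args.inputs,
		// The two APIs use different parameter names (e.g. max_new_tokens vs max_tokens).
		max_tokens: parameters.max_new_tokens,
		temperature: parameters.temperature,
		top_p: parameters.top_p,
		stop: parameters.stop,
		// Streaming (`stream: true`) returns server-sent events rather than a
		// single JSON body, so it needs dedicated handling that a
		// non-streaming base task would not provide out of the box.
	};
}
```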
Actually, most (if not all) providers are implemented only for the ….
- fix tests
…o ovhcloud-inference-provider
Hi @Wauplin, and thanks for your feedback. Actually, my remark on the OpenAI-compatible parameters also applies to the …. I agree with you that the priority on our side is to get the …. Let me know when you're ready to merge.
What
Adds OVHcloud as an inference provider.
Test Plan
Added new tests for OVHcloud both with and without streaming.
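A rough sketch of what such tests could look like, assuming vitest and `chatCompletion` / `chatCompletionStream` helpers that accept a `provider` option (the model id is a placeholder, not the one used in the PR):

```ts
import { describe, expect, it } from "vitest";
import { chatCompletion, chatCompletionStream } from "@huggingface/inference";

describe("ovhcloud provider (sketch)", () => {
	it("returns a completion without streaming", async () => {
		const res = await chatCompletion({
			accessToken: process.env.HF_TOKEN,
			provider: "ovhcloud",
			model: "<some-chat-model>",
			messages: [{ role: "user", content: "Say hello" }],
		});
		expect(res.choices.length).toBeGreaterThan(0);
	});

	it("streams a completion chunk by chunk", async () => {
		let text = "";
		for await (const chunk of chatCompletionStream({
			accessToken: process.env.HF_TOKEN,
			provider: "ovhcloud",
			model: "<some-chat-model>",
			messages: [{ role: "user", content: "Say hello" }],
		})) {
			text += chunk.choices[0]?.delta?.content ?? "";
		}
		expect(text.length).toBeGreaterThan(0);
	});
});
```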
What Should Reviewers Focus On?
I used the Cerebras PR as an example.