[Feature Request] Add LMStudio compatibility to OpenAI embeddings. #115

Open
larspohlmann opened this issue Feb 1, 2025 · 0 comments

Current situation:

When I connect to a local LMStudio server to create embeddings, the call sends an array of token IDs to `/v1/embeddings`.

LMStudio accepts only a string or an array of strings. Otherwise it throws this error: `'input' field must be a string or an array of strings`
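For illustration, the two request shapes look like this (the token IDs and model name below are made up); LMStudio only accepts the second:

```python
import json

# Shape sent when the client pre-tokenizes the input (the default
# OpenAIEmbeddings behaviour): a list of token-ID lists. LMStudio
# rejects this shape with the error quoted above.
tokenized_request = {"input": [[3923, 374, 459]], "model": "nomic-embed-text"}

# Shape LMStudio accepts: a string or a list of strings.
string_request = {"input": ["What is an embedding?"], "model": "nomic-embed-text"}

print(json.dumps(string_request))
```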

Workaround:

I was able to get the desired behaviour by editing the call to `OpenAIEmbeddings()` in `config.py`.

Currently it looks like this:
```python
return OpenAIEmbeddings(
    model=model,
    api_key=RAG_OPENAI_API_KEY,
    openai_api_base=RAG_OPENAI_BASEURL,
    openai_proxy=RAG_OPENAI_PROXY,
)
```

If I add the parameter `check_embedding_ctx_length=False`:

```python
return OpenAIEmbeddings(
    check_embedding_ctx_length=False,
    model=model,
    api_key=RAG_OPENAI_API_KEY,
    openai_api_base=RAG_OPENAI_BASEURL,
    openai_proxy=RAG_OPENAI_PROXY,
)
```

then the embedding calls work with LMStudio.

Proposal:

I propose adding an environment variable so this behaviour can be switched on without editing the code.
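A minimal sketch of how such a switch could look in `config.py` (the variable name `RAG_CHECK_EMBEDDING_CTX_LENGTH` and the `parse_bool` helper are my suggestions, not existing settings):

```python
import os

def parse_bool(value, default=True):
    """Interpret common truthy/falsy strings from the environment."""
    if value is None:
        return default
    return value.strip().lower() not in ("false", "0", "no", "off")

# Hypothetical variable name; defaults to True so the current
# behaviour is unchanged unless the user opts out for LMStudio.
RAG_CHECK_EMBEDDING_CTX_LENGTH = parse_bool(
    os.getenv("RAG_CHECK_EMBEDDING_CTX_LENGTH")
)

# The flag would then be passed through in config.py:
# return OpenAIEmbeddings(
#     check_embedding_ctx_length=RAG_CHECK_EMBEDDING_CTX_LENGTH,
#     model=model,
#     api_key=RAG_OPENAI_API_KEY,
#     openai_api_base=RAG_OPENAI_BASEURL,
#     openai_proxy=RAG_OPENAI_PROXY,
# )
```

Defaulting to `True` keeps existing OpenAI deployments working as before, while setting `RAG_CHECK_EMBEDDING_CTX_LENGTH=false` would enable the LMStudio-compatible string input.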
