Add support for Azure, OpenAI, Palm, Anthropic, Cohere Models - using litellm #26

Open · wants to merge 1 commit into main
1 change: 1 addition & 0 deletions requirements.txt
@@ -2,6 +2,7 @@ PyYAML==6.0
 beautifulsoup4==4.12.2
 numpy==1.24.2
 openai==0.27.8
+litellm==0.1.226
 python-dotenv==1.0.0
 pytz==2023.3
 sendgrid==6.10.0
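
The only dependency change is the pinned litellm==0.1.226. The new branch in src/utils.py below relies on this package exposing an OpenAI-style completion() call that routes a chat-format request to whichever provider backs the given model name, with the provider key (OPENAI_API_KEY, ANTHROPIC_API_KEY, COHERE_API_KEY, and so on) read from the environment. A minimal standalone sketch of that assumed interface, using a placeholder Anthropic model name:

import litellm

# Assumes ANTHROPIC_API_KEY is exported; the model name is only illustrative.
response = litellm.completion(
    model="claude-instant-1",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)

# litellm mirrors the OpenAI response shape, so the text sits under choices[0].
print(response["choices"][0]["message"]["content"])
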
10 changes: 10 additions & 0 deletions src/utils.py
@@ -9,6 +9,7 @@
 from typing import Optional, Sequence, Union
 
 import openai
+import litellm
 import tqdm
 from openai import openai_object
 import copy
@@ -70,6 +71,7 @@ def openai_completion(
             - a list of objects of the above types (if decoding_args.n > 1)
     """
     is_chat_model = "gpt-3.5" in model_name or "gpt-4" in model_name
+    is_litellm_model = model_name in litellm.model_list
     is_single_prompt = isinstance(prompts, (str, dict))
     if is_single_prompt:
         prompts = [prompts]
@@ -113,6 +115,14 @@
                         ],
                         **shared_kwargs
                     )
+                elif is_litellm_model:
+                    completion_batch = litellm.completion(
+                        messages=[
+                            {"role": "system", "content": "You are a helpful assistant."},
+                            {"role": "user", "content": prompt_batch[0]}
+                        ],
+                        **shared_kwargs
+                    )
                 else:
                     completion_batch = openai.Completion.create(prompt=prompt_batch, **shared_kwargs)
 
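
The routing decision added here works as follows: is_chat_model is tested first, so OpenAI chat models keep the existing openai.ChatCompletion path; is_litellm_model then catches any other model name that appears in litellm.model_list, which in the 0.1.x releases is assumed to enumerate the names litellm can route (Azure, PaLM, Anthropic, Cohere, and so on); anything else falls through to openai.Completion. A small sketch of that check, with illustrative model names:

import litellm

def routes_through_litellm(model_name: str) -> bool:
    # Same membership test as the new is_litellm_model flag; in openai_completion
    # the gpt-3.5 / gpt-4 chat branch is still checked before this one.
    return model_name in litellm.model_list

# Illustrative names; the actual contents of model_list depend on the litellm release.
for name in ("text-davinci-003", "claude-instant-1", "command-nightly"):
    print(name, routes_through_litellm(name))

One detail visible in the diff: the new branch sends only prompt_batch[0] as the user message, so it effectively assumes a batch size of 1, with the model name reaching litellm.completion through shared_kwargs.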