Implement stop_reason #57
base: main
Conversation
Note that, for some reason, Ollama does not always respond with `done_reason`. That needs to be investigated further.
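Until that's understood, one option is a nil-safe fallback in the Ollama parser. A rough sketch, not part of the diff, assuming the raw hash shape used elsewhere in this PR and Ollama's boolean `done` flag:

# Sketch only: tolerate responses where Ollama omits "done_reason".
# Falls back to "stop" when the response is marked done, otherwise leaves it nil.
def parse_completion(raw)
  stop_reason = raw["done_reason"] || (raw["done"] ? "stop" : nil)
  {
    model: raw["model"],
    choices: [LLM::Message.new(*raw["message"].values_at("role", "content"), {stop_reason:})]
  }
end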
Looks good!
@@ -20,6 +20,13 @@ def logprobs
   @extra[:logprobs]
 end

+##
Can we link to some relevant docs, similar to `logprobs`? Although there are more providers to cover this time.
Yes! Thanks.
@0x1eef should we just link to `stop_reason` from OpenAI, or link all of them? There are probably no generic docs for it.
If we could link to the relevant docs for each, I think that'd be helpful; failing that, let's just reference OpenAI.
@antaz Should we pick this pull request back up?
 LLM::Message.new(
   _1.dig("content", "role"),
-  {text: _1.dig("content", "parts", 0, "text")}
+  {text: _1.dig("content", "parts", 0, "text")},
+  {stop_reason:}
Can we write this as:
{stop_reason: _1["stop_reason"]}
@@ -20,7 +20,8 @@ def parse_completion(raw)
   {
     model: raw["model"],
     choices: raw["content"].map do
-      LLM::Message.new(raw["role"], _1["text"])
+      stop_reason = raw["stop_reason"]
+      LLM::Message.new(raw["role"], _1["text"], {stop_reason:})
Can we write this as:
{stop_reason: _1["stop_reason"]}
@@ -7,7 +7,7 @@ module ResponseParser
 def parse_completion(raw)
   {
     model: raw["model"],
-    choices: [LLM::Message.new(*raw["message"].values_at("role", "content"))],
+    choices: [LLM::Message.new(*raw["message"].values_at("role", "content"), {stop_reason: raw["done_reason"]})],
👍
@@ -19,7 +19,9 @@ def parse_completion(raw)
   {
     model: raw["model"],
     choices: raw["choices"].map do
-      LLM::Message.new(*_1["message"].values_at("role", "content"), {logprobs: _1["logprobs"]})
+      logprobs = _1["logprobs"]
Could this be:
logprobs, stop_reason = _1.values_at("logprobs", "finish_reason")
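Applied to the block above, that would read roughly as follows (a sketch; "finish_reason" is the field OpenAI puts on each choice):

choices: raw["choices"].map do
  # Pull both extras from the choice in one pass, then forward them.
  logprobs, stop_reason = _1.values_at("logprobs", "finish_reason")
  LLM::Message.new(*_1["message"].values_at("role", "content"), {logprobs:, stop_reason:})
end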
@@ -73,6 +73,10 @@
     total_tokens: 2598
   )
 end

+it "has stop reason" do
it "includes a stop reason" do
#..
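As a sketch of what the body might assert (the accessor and fixture names are hypothetical, and the expected value depends on the provider):

it "includes a stop reason" do
  # Hypothetical: adjust the accessor and expected value to the fixture
  # used by the surrounding examples.
  expect(response.choices[0].extra[:stop_reason]).to eq("stop")
end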
Changes
- Add stop_reason to LLM::Message
- Add stop_reason to Anthropic, Gemini, OpenAI and Ollama
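For reference, constructing and reading a message with the new field would look roughly like this (the reader is assumed to mirror #logprobs; the value is illustrative):

message = LLM::Message.new("assistant", "Hello!", {stop_reason: "stop"})
message.stop_reason # => "stop" (assuming a reader that mirrors #logprobs)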