feat: Return logits_prob in chat_completion #1163


Closed
Tracked by #1151
hiro-v opened this issue Jul 9, 2024 · 1 comment

hiro-v commented Jul 9, 2024

Problem

The chat_completion endpoint does not currently return per-token log probabilities (logits_prob) alongside the generated text.

Success Criteria

chat_completion responses can include logits_prob for the generated tokens.

Additional context

  • https://github.com/ggerganov/llama.cpp/issues/6285
  • https://github.com/ggerganov/llama.cpp/issues/6423
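
For reference, a minimal sketch of the requested behavior, modeled on the OpenAI chat completions `logprobs` fields. The endpoint URL, port, model id, and response shape below are illustrative assumptions, not the current cortex API.

```python
# Hypothetical sketch of the requested behavior, modeled on the OpenAI
# chat.completions "logprobs" fields. Endpoint, port, model id, and the
# response shape are assumptions for illustration only.
import requests

resp = requests.post(
    "http://localhost:3928/v1/chat/completions",  # assumed local cortex endpoint
    json={
        "model": "llama3:8b",                      # assumed model id
        "messages": [{"role": "user", "content": "Hello"}],
        "logprobs": True,                          # request per-token log probabilities
        "top_logprobs": 3,                         # also return the 3 most likely alternatives
    },
)
body = resp.json()

# Expected (hypothetical) response shape: each generated token carries its
# log probability plus the top alternative tokens, as in the OpenAI schema.
for token_info in body["choices"][0]["logprobs"]["content"]:
    print(
        token_info["token"],
        token_info["logprob"],
        [alt["token"] for alt in token_info["top_logprobs"]],
    )
```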
hiro-v added the "type: feature request" label on Jul 9, 2024
dan-menlo transferred this issue from menloresearch/cortex.llamacpp on Sep 8, 2024
dan-menlo commented

@hiro-v Great idea - merged into #1151 for Sprint 21
