chat_completion
Problem

Evaluation benchmarks such as MMLU and TruthfulQA require per-token log probabilities. However, cortex currently does not return logits_prob from chat_completion.
Success Criteria

A clear and concise description of what you want to happen.
Additional context
https://github.com/ggerganov/llama.cpp/issues/6285
https://github.com/ggerganov/llama.cpp/issues/6423
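For illustration only, the sketch below shows how a benchmark client could consume such a field if cortex followed the OpenAI-style `logprobs` / `top_logprobs` convention discussed in the llama.cpp issues above. The base URL, port, model id, and the assumption of an OpenAI-compatible `/v1/chat/completions` endpoint are placeholders, not cortex's confirmed API.

```python
# Hedged sketch, not cortex's actual API: assumes an OpenAI-compatible
# /v1/chat/completions endpoint and OpenAI-style logprobs fields.
from openai import OpenAI

# base_url, api_key, and model are placeholder values for a local server.
client = OpenAI(base_url="http://localhost:3928/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="llama3",  # placeholder model id
    messages=[{"role": "user", "content": "Answer with A, B, C, or D: ..."}],
    max_tokens=1,
    logprobs=True,    # ask the server for per-token log probabilities
    top_logprobs=5,   # and the top alternative tokens for each position
)

# MMLU/TruthfulQA-style scoring compares the log probabilities of the
# candidate answer tokens, which is exactly what this field would expose.
logprobs = response.choices[0].logprobs
if logprobs is not None:
    for token_info in logprobs.content:
        print(token_info.token, token_info.logprob)
        for alt in token_info.top_logprobs:
            print("  alt:", alt.token, alt.logprob)
```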
@hiro-v Great idea - merged into #1151 for Sprint 21