The llama2 and codellama class of models use a chat template that looks like this:
[INST] <<SYS>> {system_message} <</SYS>> {user_input} [/INST] {response}
But other models use different templates. For example, the Alpaca series of models uses a pattern like this:
### Instruction: {system_message} ### Input: {user_input} ### Response: {response}
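A template like either of the above can be expressed as a function from a list of chat messages to a prompt string. The sketch below is illustrative only, assuming a simple message shape — the interface and function names are hypothetical, not the actual code in chat.ts:

```typescript
// Assumed message shape for illustration.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Render a conversation with the llama2/codellama chat template.
// Hypothetical sketch, not the actual implementation in chat.ts.
function llama2TemplateMessages(msgs: ChatMessage[]): string {
  let systemMessage = "";
  if (msgs[0]?.role === "system") {
    systemMessage = msgs[0].content;
    msgs = msgs.slice(1);
  }
  let prompt = "";
  // Messages alternate user / assistant; pair them into [INST] blocks.
  for (let i = 0; i < msgs.length; i += 2) {
    const user = msgs[i]?.content ?? "";
    const response = msgs[i + 1]?.content;
    // The system message is folded into the first [INST] block only.
    const sys =
      i === 0 && systemMessage
        ? `<<SYS>>\n${systemMessage}\n<</SYS>>\n\n`
        : "";
    prompt += `[INST] ${sys}${user} [/INST]`;
    if (response !== undefined) {
      prompt += ` ${response} `;
    }
  }
  return prompt;
}
```

Note that the final assistant turn is left open — the prompt ends at `[/INST]` so the model completes the response, which is the "starting the response for the LLM" pattern the steps below refer to.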
To add a prompt template you should:

1. Add the template in `chat.ts` and `edit.ts`, following the pattern shown there of starting the response for the LLM.
2. Add the new name to the `TemplateType` type, and update the corresponding array in `config_schema.json`.
3. Update the `autodetectTemplateType`, `autodetectTemplateFunction`, and `autodetectPromptTemplates` functions in `core/llm/index.ts`.
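To make the wiring in steps 2 and 3 concrete, here is a minimal sketch of what registering a new template might look like. The function names mirror those mentioned above, but the type shapes and detection logic are assumptions for illustration, not the actual Continue source:

```typescript
// Step 2 (sketch): extend the TemplateType union with the new name.
// The members shown are assumptions; the real union is larger.
type TemplateType = "llama2" | "alpaca" | "none";

// Step 3 (sketch): teach autodetection to recognize the new template
// from the model name. Hypothetical logic for illustration.
function autodetectTemplateType(model: string): TemplateType {
  const lower = model.toLowerCase();
  if (lower.includes("alpaca")) {
    return "alpaca";
  }
  if (lower.includes("llama")) {
    // Covers llama2 and codellama model names.
    return "llama2";
  }
  return "none";
}
```

The corresponding `autodetectTemplateFunction` and `autodetectPromptTemplates` updates would map the new `TemplateType` value to the template functions added in step 1.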
I'm down to get this if it's still open