Support a new chat template #17

Open
sestinj opened this issue Sep 4, 2023 · 0 comments
Labels
good first issue (Good for newcomers)

Comments

sestinj (Contributor) commented Sep 4, 2023

The llama2 and codellama families of models use a chat template that looks like this:

[INST] <<SYS>>
{system_message}
<</SYS>>

{user_input} [/INST] {response}

But other models use different templates. For example, the Alpaca series of models uses a pattern like this:

### Instruction: {system_message}

### Input: {user_input}

### Response: {response}
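To make the difference concrete, here is a minimal TypeScript sketch of how a template like Alpaca's might be written as a function over a list of messages. This is for illustration only: the ChatMessage shape and the function signature are assumptions, so match whatever convention the existing templates in chat.ts follow.

// Hypothetical sketch of a chat template function; the type and signature
// are assumptions, not the actual chat.ts API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function templateAlpacaMessages(msgs: ChatMessage[]): string {
  let prompt = "";
  let rest = msgs;
  // Alpaca places the system message under "### Instruction:".
  if (rest[0]?.role === "system") {
    prompt += `### Instruction: ${rest[0].content}\n\n`;
    rest = rest.slice(1);
  }
  for (const msg of rest) {
    if (msg.role === "user") {
      prompt += `### Input: ${msg.content}\n\n`;
    } else {
      prompt += `### Response: ${msg.content}\n\n`;
    }
  }
  // End with an open "### Response:" header so the model completes it.
  prompt += "### Response: ";
  return prompt;
}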

To add a prompt template you should:

  1. Add the chat template to chat.ts (the sketch above shows the general shape of such a template function).
  2. Add a template for edits in edit.ts, following the pattern shown there of pre-filling the start of the LLM's response (see the sketch after this list).
  3. Add a new value to the TemplateType type, and update the corresponding array in config_schema.json.
  4. Update the autodetectTemplateType, autodetectTemplateFunction, and autodetectPromptTemplates functions in core/llm/index.ts (steps 3 and 4 are sketched together below).
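For step 2, here is a hedged sketch of what an Alpaca-style edit template might look like. The placeholder names ({user_input}, {code_to_edit}) are illustrative; use whatever placeholder syntax the existing templates in edit.ts use. The key pattern is that the template ends mid-response, pre-filling the start of the LLM's answer so it continues directly with the rewritten code:

// Hypothetical sketch of an edit template; the placeholder syntax is an
// assumption, so mirror the existing templates in edit.ts.
const alpacaEditPrompt = `### Instruction: Rewrite the code below to satisfy this request: "{user_input}"

### Input:
{code_to_edit}

### Response: Sure! Here is the rewritten code:
`;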
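For steps 3 and 4, the changes might look roughly like this. The existing members of TemplateType and the model-name matching rules are assumptions made up for illustration; mirror what core/llm/index.ts already does for llama2, and remember to add the same string to the corresponding enum array in config_schema.json:

// Hypothetical sketch; the existing union members and detection rules
// are assumptions, not the real definitions.
type TemplateType = "llama2" | "alpaca" | "my-new-template";

function autodetectTemplateType(model: string): TemplateType | undefined {
  const lower = model.toLowerCase();
  // Route model names that contain a known marker to the new template.
  if (lower.includes("my-new-model")) {
    return "my-new-template";
  }
  if (lower.includes("llama")) {
    return "llama2";
  }
  return undefined;
}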
sestinj added the good first issue label Sep 4, 2023
Projects
Status: Good First Issues (Code)