Prompt format for Llama 3 #60

Closed · iamwillpowers opened this issue Apr 29, 2024 · 1 comment

Comments

@iamwillpowers

My prompt format, following discussions I've seen here and on the llama.cpp repo, is as follows:

"<|begin_of_text|><|start_header_id|>system<|end_header_id|>{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"

For the reverse_prompt, I'm just using <|eot_id|>.

The first response is coherent, but the second response seems to repeat a fragment of the first one. Can someone identify what I'm doing incorrectly?

@guinmoon
Owner

I recently added a template for Llama 3; try using it.
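
For reference, Meta's published Llama 3 chat template places two newlines after each `<|end_header_id|>` and ends with an open assistant header, with `<|eot_id|>` used as the stop (reverse) token. Below is a minimal Swift sketch of assembling such a prompt; the helper function and its signature are illustrative only, not the template shipped with this repo:

```swift
// Minimal sketch (not LLMFarm's actual code) of Meta's documented Llama 3
// chat template. Note the two newlines after each <|end_header_id|>, which
// the format quoted in the question omits.
func llama3Prompt(system: String, turns: [(role: String, content: String)]) -> String {
    var prompt = "<|begin_of_text|>"
    prompt += "<|start_header_id|>system<|end_header_id|>\n\n\(system)<|eot_id|>"
    for turn in turns {
        prompt += "<|start_header_id|>\(turn.role)<|end_header_id|>\n\n\(turn.content)<|eot_id|>"
    }
    // Leave an open assistant header so the model generates the reply;
    // generation is then stopped on <|eot_id|> (the reverse prompt above).
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt
}

// Example first turn:
let prompt = llama3Prompt(
    system: "You are a helpful assistant.",
    turns: [(role: "user", content: "Hello!")]
)
```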
