
Does one need to manually wrap user requests in a system prompt for llama3, or does llama_cpp already do it under the hood? #7243

Unanswered
ch3rn0v asked this question in Q&A
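To make the question concrete, here is a minimal sketch of the two alternatives being asked about, assuming the llama-cpp-python bindings and a local Llama 3 Instruct GGUF file (the model path and the `chat_format="llama-3"` value are illustrative placeholders, not details taken from the discussion): either pass structured messages to `create_chat_completion`, which lets the library apply the model's chat template, or hand-assemble the Llama 3 header/eot tokens and call the raw completion API.

```python
from llama_cpp import Llama

# Illustrative model path; replace with your own GGUF file.
llm = Llama(
    model_path="models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",
    chat_format="llama-3",  # assumption: explicitly selecting the Llama 3 template
)

# Option A: let the library wrap the system/user messages via its chat template.
chat_out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    max_tokens=128,
)
print(chat_out["choices"][0]["message"]["content"])

# Option B: manually format the Llama 3 prompt and use the raw completion API.
prompt = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "You are a helpful assistant.<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    "Hello!<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
raw_out = llm(prompt, max_tokens=128, stop=["<|eot_id|>"])
print(raw_out["choices"][0]["text"])
```

Whether the high-level chat API already handles this wrapping in a given setup (and for llama.cpp proper rather than the Python bindings) is exactly what the question leaves open.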

Replies: 0 comments
