Replies: 1 comment
-
Llama-cpp-python is a wrapper around llama.cpp, so any questions about how to call the Python code should be directed to that project. Prompt formatting is a general question about language model behavior that varies with the model, not with the engine running it, so you'd find better help at a forum for language model beginners rather than from backend developers. The specific model you've asked about uses the Mistral instruct prompt format. Also, correct the spacing of your prompt, since every space character is a token that can throw the model off; you would want something more akin to:
Note that the text ends and the answer begins on the same line, surrounded by spaces.
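A minimal sketch of what that prompt construction could look like in Python. The helper function, instruction, sample text, and generation parameters below are all illustrative, not part of the original reply; only the `[INST] ... [/INST]` wrapping and the spacing rule come from the discussion above.

```python
# Sketch: build a Mistral-instruct-style extraction prompt.
# The tag spacing matters: single spaces around "[INST]" and "[/INST]",
# with the model's answer expected to continue on the same line.

def build_mistral_prompt(instruction: str, text: str) -> str:
    """Wrap an instruction and the source text in Mistral's [INST] format."""
    return f"[INST] {instruction}\n\n{text} [/INST]"

prompt = build_mistral_prompt(
    "Extract every person named in the following text.",  # hypothetical task
    "Alice met Bob in Paris.",                            # hypothetical input
)
print(prompt)

# Hypothetical invocation via llama-cpp-python's high-level API
# (requires a local GGUF file, so it is left commented out here):
# from llama_cpp import Llama
# llm = Llama(model_path="mistral-7b-instruct-v0.2.Q3_K_S.gguf")
# out = llm(prompt, max_tokens=256, stop=["</s>"])
# print(out["choices"][0]["text"])
```

Note that llama.cpp typically prepends the BOS token itself, so you usually should not add `<s>` to the prompt string by hand.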
-
I am still confused about how to use llama-cpp-python correctly when I want to analyze the content of a text. I am using mistral-7b-instruct-v0.2.Q3_K_S.gguf, and I have a text and an instruction; I want to extract information from the given text. This is not the chat use case that seems to be discussed 99% of the time. I am using the following prompt template for my approach:
In my Python code I call the model like this:
Is this a correct way to build a prompt for text analysis?
Does the prompt depend on the model I am using, or does llama-cpp generalize this somehow?