[FEATURE] keep SystemMessage first in ChatMemory #1111
Comments
Hi, there is some logic behind the current behaviour (I can explain a bit later). But do you see a problem with it? Does the LLM behave in a way you don't want? Thanks
Using the following sample, where I used a simple `ChatMemory` implementation:

```java
public class ChatMemoryExamples {

    public static class SimpleChatMemory implements ChatMemory {

        private final List<ChatMessage> messages = new ArrayList<>();

        @Override
        public Object id() {
            return "simple";
        }

        @Override
        public void add(ChatMessage message) {
            messages.add(message);
        }

        @Override
        public List<ChatMessage> messages() {
            return new ArrayList<>(messages);
        }

        @Override
        public void clear() {
            messages.clear();
        }
    }

    public static class BadPositionningSystemMessage_Example {

        public static void main(String[] args) throws IOException {
            ChatLanguageModel model = OpenAiChatModel.withApiKey(ApiKeys.OPENAI_API_KEY);
            SystemMessage systemMessage = new SystemMessage("provide answer to user questions even if it is not real time data. format the result of user question using asciidoc format");
            UserMessage userFormatMessage = new UserMessage("format the result of the following questions using xml format");
            UserMessage userQuestion = new UserMessage("provide me the list of the 10 biggest cities in term on inhabitants in the USA");
            AiMessage response;

            ChatMemory memory = new SimpleChatMemory();
            memory.add(systemMessage);
            memory.add(userFormatMessage);
            memory.add(userQuestion);
            response = model.generate(memory.messages()).content();
            System.out.println("With ordered messages:");
            System.out.println(memory.messages());
            System.out.println("Answer:");
            System.out.println(response.text());

            memory.clear();
            memory.add(userFormatMessage);
            memory.add(systemMessage);
            memory.add(userQuestion);
            response = model.generate(memory.messages()).content();
            System.out.println("With ordered messages:");
            System.out.println(memory.messages());
            System.out.println("Answer:");
            System.out.println(response.text());
        }
    }
}
```

the answer is as follows:
Thus, to me it looks obvious that ensuring a proper order is important. IMO it would be clearer to always ensure a standard behavior. That's why I propose to keep the SystemMessage first when one is present.
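The proposed behaviour could be sketched roughly as follows. This is a minimal, self-contained illustration using simplified stand-in types (`Role`, `Msg`), not the real langchain4j `ChatMessage` hierarchy or `ChatMemory` interface:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for illustration only; langchain4j's real
// SystemMessage/UserMessage/AiMessage types are richer than this.
enum Role { SYSTEM, USER, AI }
record Msg(Role role, String text) {}

// A memory that pins the (single) system message at index 0,
// as proposed in this issue.
class SystemFirstMemory {

    private final List<Msg> messages = new ArrayList<>();

    void add(Msg msg) {
        if (msg.role() == Role.SYSTEM) {
            if (!messages.isEmpty() && messages.get(0).role() == Role.SYSTEM) {
                messages.set(0, msg);  // replace the previous system message
            } else {
                messages.add(0, msg);  // always insert at the first position
            }
        } else {
            messages.add(msg);         // other messages keep arrival order
        }
    }

    List<Msg> messages() {
        return new ArrayList<>(messages);
    }
}
```

With this sketch, the two orderings in the example above would produce the same message list sent to the model.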
Hi, I am not sure I understand your point from this example. In the first case, the LLM ignored the system instructions regarding formatting. In the second case, it ignored the system instructions about real-time data. In my understanding, both cases demonstrate a weakness of this particular LLM, but do not prove anything regarding the order of messages. Now, why the current logic: an AI Service can have multiple methods with different system messages, but still share the same memory. Let's imagine 2 methods with the following system messages:
Now let's imagine that there were a few interactions in English, but now the user wants to switch to German. If we put the new system message in the first place in the memory, before all the user/AI messages in English, with a high probability the LLM will ignore it and keep answering in English (the English history acts as few-shot examples).
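The rationale above can be sketched like this: a new, different system message evicts the old one and is appended at the end, so it sits after the accumulated few-shot history rather than being buried in front of it. This is a simplified illustration with stand-in types, not the actual langchain4j implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for illustration only.
enum Kind { SYSTEM, USER, AI }
record ChatMsg(Kind kind, String text) {}

// Sketch of the behaviour described above: a changed system message
// replaces the stale one and goes last, closest to the next generation.
class LatestSystemLastMemory {

    private final List<ChatMsg> messages = new ArrayList<>();

    void add(ChatMsg msg) {
        if (msg.kind() == Kind.SYSTEM) {
            // Ignore an exact duplicate of an already stored system message.
            if (messages.contains(msg)) {
                return;
            }
            // Drop the stale system message before appending the new one.
            messages.removeIf(m -> m.kind() == Kind.SYSTEM);
        }
        messages.add(msg);
    }

    List<ChatMsg> messages() {
        return new ArrayList<>(messages);
    }
}
```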
Consider adding "always keep system message on index 0" as a configurable property on
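Such a flag could toggle between the two strategies discussed above. Note that `keepSystemMessageFirst` is a hypothetical option, not an existing langchain4j builder property; the types are simplified stand-ins:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for illustration only.
enum MsgRole { SYSTEM, USER, AI }
record Message(MsgRole role, String text) {}

// keepSystemMessageFirst is NOT a real langchain4j property; it only
// illustrates the configurable behaviour suggested above.
class ConfigurableMemory {

    private final boolean keepSystemMessageFirst;
    private final List<Message> messages = new ArrayList<>();

    ConfigurableMemory(boolean keepSystemMessageFirst) {
        this.keepSystemMessageFirst = keepSystemMessageFirst;
    }

    void add(Message msg) {
        if (msg.role() == MsgRole.SYSTEM) {
            // Keep at most one system message in memory.
            messages.removeIf(m -> m.role() == MsgRole.SYSTEM);
            if (keepSystemMessageFirst) {
                messages.add(0, msg); // pinned at index 0
                return;
            }
        }
        messages.add(msg); // default: append like any other message
    }

    List<Message> messages() {
        return new ArrayList<>(messages);
    }
}
```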
**Is your feature request related to a problem? Please describe.**
The behavior of `ChatMemory` implementations looks a bit weird in the position handling of `SystemMessage`:
- if a `SystemMessage` is added first, it is kept as the first item
- if a `SystemMessage` is added later, either after a `UserMessage` or after a first `SystemMessage`, it is placed at the last position
- the `SystemMessage` position will then go up in the memory until it reaches the first place and remains there

**Describe the solution you'd like**
I'd like to standardize and simplify the position handling of the `SystemMessage` so that it is always kept as the first message in the list.

**Describe alternatives you've considered**
I could reimplement (add my own implementation) or proxify the existing implementations: `MessageWindowChatMemory` and `TokenWindowChatMemory`.

**Additional context**
Based on the current state of `langchain4j:0.30.0`.
May I propose something to solve this?
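The "proxify existing implementations" alternative could be a thin decorator that reorders messages on read, leaving the wrapped memory untouched. A sketch, where `SimpleMemory` is a simplified stand-in for the real langchain4j `ChatMemory` interface:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for illustration only.
enum PartKind { SYSTEM, USER, AI }
record Part(PartKind kind, String text) {}

// Stand-in for the langchain4j ChatMemory interface.
interface SimpleMemory {
    void add(Part part);
    List<Part> messages();
}

// A basic list-backed memory, standing in for an existing implementation.
class ListBackedMemory implements SimpleMemory {
    private final List<Part> parts = new ArrayList<>();
    public void add(Part part) { parts.add(part); }
    public List<Part> messages() { return new ArrayList<>(parts); }
}

// Decorator that moves the system message to the front on read,
// without modifying the wrapped implementation.
class SystemFirstProxy implements SimpleMemory {

    private final SimpleMemory delegate;

    SystemFirstProxy(SimpleMemory delegate) {
        this.delegate = delegate;
    }

    public void add(Part part) {
        delegate.add(part);
    }

    public List<Part> messages() {
        List<Part> result = new ArrayList<>(delegate.messages());
        for (int i = 1; i < result.size(); i++) {
            if (result.get(i).kind() == PartKind.SYSTEM) {
                result.add(0, result.remove(i)); // hoist to index 0
                break; // assumes at most one system message in memory
            }
        }
        return result;
    }
}
```

Reordering on read keeps the wrapped memory's eviction logic (window or token based) completely intact, since the stored order never changes.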