Add new ChatMemory implementation to be used for stateful data extraction #1067
This pull request implements the `ChatMemory` that I have discussed and proposed here. It should be a good fit for stateful data extraction; for instance, the included test case produces the following output:

In essence, what this `ChatMemory` does is simply concatenate the values of the variables sent by the user at each iteration, recreating at each step the user message from the original prompt template and those concatenated variables. In this way, the user message sent to the LLM at the 3rd prompt of my example above will be something like:

```
"Extract information about a customer from this text 'hi. my name is Mario Fusco. I'm 50'. The response must contain only the JSON with customer's data and without any other sentence. You must answer strictly in the following JSON format: {\n"firstName": (type: string),\n"lastName": (type: string),\n"age": (type: integer)"
```
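The concatenate-and-re-render mechanism can be sketched roughly as follows. This is a simplified illustration with hypothetical class and method names (`StatefulVariablesMemory`, `addUserVariables`), not the actual code in this PR, and it uses a naive `{{name}}` placeholder substitution in place of the real `PromptTemplate` rendering:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a memory that merges the variable values received at
// each iteration and recreates the user message from the original template.
class StatefulVariablesMemory {
    private final String promptTemplate;
    private final Map<String, String> mergedVariables = new HashMap<>();

    StatefulVariablesMemory(String promptTemplate) {
        this.promptTemplate = promptTemplate;
    }

    // Concatenate each new variable value with what was sent before,
    // then re-render the user message from the template.
    String addUserVariables(Map<String, String> newVariables) {
        newVariables.forEach((name, value) ->
                mergedVariables.merge(name, value, (old, v) -> old + " " + v));
        return render();
    }

    // Naive stand-in for PromptTemplate rendering: substitute {{name}}
    // placeholders with the concatenated variable values.
    private String render() {
        String message = promptTemplate;
        for (Map.Entry<String, String> e : mergedVariables.entrySet()) {
            message = message.replace("{{" + e.getKey() + "}}", e.getValue());
        }
        return message;
    }
}
```

With a template containing a `{{text}}` placeholder, sending `hi.`, then `my name is Mario Fusco.`, then `I'm 50` as the `text` variable over three iterations would yield a user message built from the full concatenated text, as in the example above.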
In order to implement this feature I had to add to the `UserMessage` both the prompt template and the set of variables from which it was created. I believe that carrying this information can also be useful beyond the specific needs of this pull request. In reality, it would probably be an even better design if the `UserMessage` knew how to render itself and used the `PromptTemplate` internally, instead of having its text populated from the outside as it does now. I'm open to also implementing this further improvement, but for now I just wanted to demonstrate the general idea with the smallest possible set of changes.

/cc @sebastienblanc