[Feature] python: Provide a way to get or use the max context size supported by the LLM #2283
Labels: bindings (gpt4all-binding issues), enhancement (New feature or request), python-bindings (gpt4all-bindings Python specific issues), Feature Request
Currently, the default context length is always set to 2048, but many models support a larger context window. I tried to showcase a simple first step by adding the context limit to models3.json, but that change was rejected with a request to open an issue instead.
According to @cebtenzzre (ref), this is already done in the GUI.
Request: Set the default context length in the bindings to None, and if it is not set, dynamically set the context length based on the chosen model.
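The requested fallback logic could look something like the sketch below. This is only an illustration of the proposed behavior, not the actual gpt4all bindings API: `resolve_n_ctx`, its parameters, and `DEFAULT_N_CTX` are hypothetical names, and how the model's maximum context would actually be read (e.g. from model metadata) is left out.

```python
from typing import Optional

DEFAULT_N_CTX = 2048  # the current hard-coded default in the bindings

def resolve_n_ctx(requested: Optional[int], model_max: Optional[int]) -> int:
    """Pick the context length for a session.

    An explicit user-requested value wins; otherwise fall back to the
    model's reported maximum context; otherwise keep the old default.
    (Hypothetical helper — not part of the real gpt4all API.)
    """
    if requested is not None:
        return requested
    if model_max is not None:
        return model_max
    return DEFAULT_N_CTX
```

With this, a user passing nothing would automatically get the model's full window (e.g. `resolve_n_ctx(None, 8192)` yields 8192), while existing callers that pass an explicit value see no behavior change.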