Bug Report
Description
Bug Summary:
Open WebUI settings appear to be adding an extra zero to the Context Length value.
This may be related to a feature request?...
Steps to Reproduce:
Set the context length, then read the log file.
Expected Behavior:
No change to user's context length setting
Actual Behavior:
An extra zero appears to be added to the context length.
Environment
Open WebUI Version: 0.1.123
Ollama (if applicable): 0.1.34
Operating System: RHEL8
**Browser (if applicable):** Chrome
Reproduction Details
Set the context length, then read the log file. These two log examples show that my context length had an extra zero added:
time=2024-05-08T20:12:24.868Z level=WARN source=memory.go:17 msg="requested context length is greater than model max context length" requested=81920 model=65536
time=2024-05-08T05:24:12.168Z level=WARN source=memory.go:17 msg="requested context length is greater than model max context length" requested=20480 model=8192
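In both logs the requested value is exactly ten times the configured one (8192 became 81920, 2048 became 20480), which looks more like a stray digit appended to a string than an arithmetic error. A minimal sketch of that suspected pattern (hypothetical; the actual Open WebUI code path handling this setting has not been confirmed here):

```python
def apply_ctx_setting_buggy(user_value: str) -> int:
    """Suspected pattern (hypothetical): the numeric field is handled as a
    string and an extra "0" is appended before conversion."""
    return int(user_value + "0")  # "2048" becomes 20480


def apply_ctx_setting_fixed(user_value: str) -> int:
    """Parse the setting as an integer before passing it along."""
    return int(user_value)  # "2048" stays 2048


# Both observed log values match the x10 pattern:
assert apply_ctx_setting_buggy("8192") == 81920  # logged as requested=81920
assert apply_ctx_setting_buggy("2048") == 20480  # logged as requested=20480
assert apply_ctx_setting_fixed("2048") == 2048
```

This is only an illustration of why a uniform factor of ten points at string handling rather than a computation bug.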
Confirmation:
[x] I have read and followed all the instructions provided in the README.md.
[x] I am on the latest version of both Open WebUI and Ollama.
[x] I have included the browser console logs.
[ ] I have included the Docker container logs.
Logs and Screenshots
See the attached snapshot of a potentially related issue, with the extra zero highlighted.
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
See attached.
Installation Method
docker
Additional Information
See above.
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
There appears to be a second (related?) issue/bug with Settings > Advanced Parameters > Max Tokens: when it is set to 2048, an extra zero is added here too during a conversation. See below for two examples showing 20480 where 2048 was expected:
...
time=2024-05-09T14:33:27.120Z level=WARN source=server.go:77 msg="requested context length is greater than the model's training context window size" requested=20480 "training size"=4096
...
.................................................................................................
llama_new_context_with_model: n_ctx = 20480