
This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


bug: Settings>Advanced Parameters>Context Length #2141

Closed
edwardochoaphd opened this issue May 9, 2024 · 1 comment

Comments

@edwardochoaphd

Bug Report

Description

Bug Summary:
Open WebUI settings appear to add an extra zero to the Context Length value.
This may be related to a feature request?...
extraZeroIssue

Steps to Reproduce:
Set the context length, then read the log file.

Expected Behavior:
No change to the user's context length setting.

Actual Behavior:
An extra zero appears to be added to the context length.

Environment

  • Open WebUI Version: 0.1.123

  • Ollama (if applicable): 0.1.34

  • Operating System: RHEL8

  • Browser (if applicable): Chrome

Reproduction Details

Set the context length, then read the log file. These are two log examples showing that my context length had an extra zero added:

  • time=2024-05-08T20:12:24.868Z level=WARN source=memory.go:17 msg="requested context length is greater than model max context length" requested=81920 model=65536
  • time=2024-05-08T05:24:12.168Z level=WARN source=memory.go:17 msg="requested context length is greater than model max context length" requested=20480 model=8192
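Both logged values are exactly ten times a plausible setting (81920 = 8192 × 10, 20480 = 2048 × 10), which is consistent with a stray trailing "0" being appended to the value as a string before it is parsed as a number. A minimal sketch of how such a symptom could arise (hypothetical illustration, not Open WebUI's actual code):

```python
# Hypothetical sketch: settings values are often carried as strings,
# and concatenating a stray "0" onto a numeric string multiplies the
# parsed value by exactly 10 -- the pattern seen in the logs above.

def parse_ctx(raw: str) -> int:
    """Parse a context-length setting from its string form."""
    return int(raw)

intended = "8192"
print(parse_ctx(intended))         # the value the user set: 8192

# If a stray "0" is concatenated onto the string value somewhere
# between the UI and the backend...
corrupted = intended + "0"
print(parse_ctx(corrupted))        # 81920 -- matches requested=81920 in the log
```

Whatever the actual cause, the exact ×10 relationship in both log lines suggests a string-handling slip rather than an arithmetic one.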

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [ ] I have included the Docker container logs.

Logs and Screenshots

See the attached snapshot of the potentially related issue, with the extra zero highlighted.

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
See attached.

Installation Method

docker

Additional Information

See above.

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@edwardochoaphd
Author

It appears there may be a second (related?) issue/bug with Settings>Advanced Parameters>Max Tokens:
when set to 2048, an extra zero is added here too during a conversation?... See below two examples of 20480, which should have been 2048.
...
time=2024-05-09T14:33:27.120Z level=WARN source=server.go:77 msg="requested context length is greater than the model's training context window size" requested=20480 "training size"=4096
...
.................................................................................................
llama_new_context_with_model: n_ctx = 20480

@open-webui open-webui locked and limited conversation to collaborators May 9, 2024
@tjbck tjbck converted this issue into discussion #2147 May 9, 2024

