Replies: 3 comments 3 replies
-
When using a third-party OpenAI-compatible API behind an nginx reverse proxy, I'm having the same issue: the open-webui logs show a 200 response. What should I do about this?
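When this error shows up behind nginx, it is often caused by compression being applied twice, or by a `Content-Encoding` header surviving while the body was re-encoded along the way. A minimal nginx location block that avoids this is sketched below; it assumes Open WebUI listens on `127.0.0.1:3000` (adjust to your setup), and is a starting point rather than a definitive fix:

```nginx
location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;

    # Ask the upstream for an uncompressed body, so nothing downstream
    # has to decode a body whose encoding header no longer matches it.
    proxy_set_header Accept-Encoding "";

    # Needed for streamed chat responses (chunked/SSE).
    proxy_buffering off;

    # Standard proxy headers.
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;

    # WebSocket upgrade, used by Open WebUI.
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```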
-
I encountered a similar issue when calling the Qwen series models. One thing I observed is that all /chat/completions requests included a header that the response headers from OpenAI do not return. I'm not sure whether this needs to be fixed in open-webui or in the Qwen model API.
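For what it's worth, `net::ERR_CONTENT_DECODING_FAILED` typically means the `Content-Encoding` the response claims does not match how the body is actually encoded. A small check like the following can confirm that mismatch when you capture a raw response; the function name is my own, not anything from open-webui:

```python
import gzip
import zlib

def body_matches_encoding(body: bytes, content_encoding: str) -> bool:
    """Return True if the body plausibly matches the claimed Content-Encoding."""
    encoding = content_encoding.strip().lower()
    if encoding in ("", "identity"):
        return True  # no transformation claimed
    if encoding == "gzip":
        # gzip streams always start with the magic bytes 0x1f 0x8b
        return body[:2] == b"\x1f\x8b"
    if encoding == "deflate":
        try:
            zlib.decompress(body)  # zlib-wrapped deflate
            return True
        except zlib.error:
            try:
                # some servers send raw deflate without the zlib wrapper
                zlib.decompress(body, -zlib.MAX_WBITS)
                return True
            except zlib.error:
                return False
    return False  # unknown encoding (e.g. br) — treat as a mismatch

# A body labelled gzip but sent as plain text is exactly what makes the
# browser fail with ERR_CONTENT_DECODING_FAILED:
print(body_matches_encoding(gzip.compress(b'{"ok":true}'), "gzip"))
print(body_matches_encoding(b'{"ok":true}', "gzip"))
```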
-
I've solved this problem now: I use one-api to proxy the various OpenAI-compatible APIs locally and then point open-webui at that local proxy, and there is no problem.
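For anyone trying the same workaround, a minimal way to run one-api locally is a compose file like the one below. The image name, port, and data path are assumptions based on common one-api setups, so check the one-api project README for your version; afterwards, point Open WebUI's OpenAI-compatible base URL at the proxy (e.g. `http://<host>:3000/v1`):

```yaml
# docker-compose.yml — run one-api as a local proxy in front of the
# upstream OpenAI-compatible APIs (image/port are assumptions).
services:
  one-api:
    image: justsong/one-api
    ports:
      - "3000:3000"
    volumes:
      - ./one-api-data:/data
```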
-
Bug Report
Description
When using a third-party OpenAI API, the network tab of the browser's F12 console shows net::ERR_CONTENT_DECODING_FAILED, but the docker logs show 200 OK. The same API works fine when called from Python or other AI clients.
Environment
Open WebUI Version: v0.1.124
Ollama (if applicable): 0.1.37
Operating System: Ubuntu 20.04
Browser (if applicable): Chrome 124.0.6367.92
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
Both Ollama and Open WebUI are installed through Docker.
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!