The frontend cannot show the generated text by the LLM #4081
Replies: 3 comments
-
It looks like the issue with the frontend not displaying the generated text from your LLM might be due to how the Docker network configuration is set up, particularly how your services communicate with each other. Here's a more streamlined approach to troubleshooting and potentially resolving this issue:
By following these steps, you should be able to diagnose and fix the problem with the frontend not displaying the generated text. If you encounter any specific errors or need further assistance, feel free to ask.
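As a quick sanity check of that networking point, you can run a small script from inside the Dify `api` container and see whether it can reach your local server at all. The `http://host.docker.internal:8000` address below is only an assumption about where your server listens; substitute your real host and port, and note that on plain Linux Docker `host.docker.internal` has to be mapped explicitly (e.g. `extra_hosts: ["host.docker.internal:host-gateway"]` in the compose file).

```python
# check_llm.py -- run from inside the Dify api container, for example:
#   docker exec -i <api-container-name> python3 - < check_llm.py
# The base URL is an assumption; point it at your local OpenAI-compatible server.
# If your server does not implement /v1/models, use any path it does serve.
import json
import urllib.request

BASE_URL = "http://host.docker.internal:8000/v1"  # not "localhost", which would
                                                  # resolve to the container itself

try:
    with urllib.request.urlopen(f"{BASE_URL}/models", timeout=5) as resp:
        print("Reachable. /v1/models returned:")
        print(json.dumps(json.loads(resp.read()), indent=2))
except Exception as exc:  # connection refused, DNS failure, timeout, ...
    print(f"Not reachable from inside the container: {exc}")
```

If this fails inside the container but the same URL works from your host shell, the problem is container-to-host networking rather than the model itself.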
-
My local OpenAI-API-compatible server only implements the …
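If the server only implements the blocking (non-streaming) response format, Dify's chat view, which reads the reply as a stream, can show nothing even though the text appears in the server logs; that is only a guess about the situation here. As a rough sketch, an OpenAI-compatible `/v1/chat/completions` endpoint that also honors `stream: true` might look like the following (FastAPI and the `run_model()` helper are hypothetical stand-ins, not anything from this thread):

```python
import json
import time
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse, StreamingResponse

app = FastAPI()

def run_model(messages):
    # Hypothetical stand-in for the real model call.
    return "Hello from the local model."

@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    text = run_model(body.get("messages", []))

    if not body.get("stream", False):
        # Blocking response in the standard OpenAI chat-completion shape.
        return JSONResponse({
            "id": "chatcmpl-local",
            "object": "chat.completion",
            "created": int(time.time()),
            "model": body.get("model", "local"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }],
        })

    def sse():
        # Streaming response: one SSE "data:" line per chunk, then [DONE].
        for token in text.split(" "):
            chunk = {
                "id": "chatcmpl-local",
                "object": "chat.completion.chunk",
                "created": int(time.time()),
                "model": body.get("model", "local"),
                "choices": [{
                    "index": 0,
                    "delta": {"content": token + " "},
                    "finish_reason": None,
                }],
            }
            yield f"data: {json.dumps(chunk)}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(sse(), media_type="text/event-stream")
```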
-
How is the text generated by the LLM returned to the frontend and shown in the Studio chatbox?
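On the mechanics of that question: in streaming mode the generated text travels back to the browser as a stream of server-sent events, and the chatbox appends each chunk as it arrives. As a rough illustration of the same pattern on the public API (the `/v1/chat-messages` path, the `app-...` key, and the event/field names below are assumptions based on my reading of the Dify API docs, not something confirmed in this thread), a client can consume the stream like this:

```python
import json
import requests

API_BASE = "http://localhost/v1"      # assumption: default self-hosted base URL
API_KEY = "app-xxxxxxxxxxxxxxxx"      # hypothetical app API key

resp = requests.post(
    f"{API_BASE}/chat-messages",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "query": "Hello",
        "inputs": {},
        "user": "debug-user",
        "response_mode": "streaming",  # ask for SSE instead of a blocking reply
    },
    stream=True,
)

for line in resp.iter_lines():
    if not line or not line.startswith(b"data: "):
        continue
    event = json.loads(line[len(b"data: "):])
    # "message" events carry the incremental answer text the chatbox displays.
    if event.get("event") == "message":
        print(event.get("answer", ""), end="", flush=True)
print()
```

If the model provider never produces those chunks, the chatbox has nothing to append, which would match the behavior described in this thread.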
-
Dify version: 0.6.6
Cloud or Self Hosted: Self Hosted (Docker)
Steps to reproduce
I deployed a local OpenAI-API-compatible server, but the frontend cannot show the text generated by the LLM. From the logs of my local server, however, I can see the generated text, which indicates that it really is generating output (see the dialogue on the right in the screenshot below).
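One way to narrow this down is to call the local server directly with streaming enabled and confirm that it emits `data:` chunks over HTTP rather than only printing the text in its own logs. A minimal sketch (the base URL and model name are assumptions; adjust them to your deployment):

```python
import requests

BASE_URL = "http://localhost:8000/v1"   # assumption: where the local server listens

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "local-model",          # assumption: whatever name the server expects
        "messages": [{"role": "user", "content": "Say hello"}],
        "stream": True,
    },
    stream=True,
    timeout=30,
)
resp.raise_for_status()

# A healthy streaming endpoint emits a series of "data: {...}" lines ending
# with "data: [DONE]"; if nothing is printed here, the UI also has nothing to show.
for line in resp.iter_lines():
    if line:
        print(line.decode())
```

If this prints the expected `data:` lines, the server side is streaming correctly and the issue is more likely in how Dify is configured to reach it.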
✔️ Expected Behavior
The frontend shows the text generated by the LLM.
❌ Actual Behavior
The frontend does not show the text generated by the LLM.