Why can't the frontend show the text generated by the LLM? #3896
zengqingfu1442 started this conversation in General
Replies: 0 comments
I deployed a local OpenAI-API-compatible server, but the frontend cannot show the text generated by the LLM. See the dialogue on the right below.
However, in the logs of my local OpenAI-API-compatible server, I can see the generated text, which indicates that the model really did generate output.
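One possible cause (an assumption on my part, not confirmed above) is a mismatch in the streaming format: OpenAI-compatible servers typically stream completions as server-sent events (`data: {...}` lines ending with `data: [DONE]`), and if the frontend fails to parse those chunks it renders nothing even though the server logged the full text. A minimal sketch of how such a stream is assembled into display text (chunk contents are illustrative):

```python
import json

# Illustrative SSE payload in the shape an OpenAI-compatible server streams.
raw_stream = (
    'data: {"choices": [{"delta": {"content": "Hello"}}]}\n\n'
    'data: {"choices": [{"delta": {"content": ", world"}}]}\n\n'
    "data: [DONE]\n\n"
)

def collect_text(stream: str) -> str:
    """Accumulate delta content from 'data:' lines, stopping at [DONE]."""
    parts = []
    for line in stream.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content", ""))
    return "".join(parts)

print(collect_text(raw_stream))  # -> Hello, world
```

Comparing the raw response on the wire (e.g. in the browser's network tab) against what a parser like this recovers may show whether the frontend is dropping well-formed chunks or the server is emitting a format the frontend does not expect.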