Client-side LLM selection, multi-model support #4359
Aurelius-Huang started this conversation in Feature requests
Replies: 2 comments 1 reply
-
This is great 😃, and it would be awesome to have the same feature via the API as well.
-
Basic Chatbots are designed to have only one LLM configured. In most cases you'll want it this way, because every LLM behaves differently, so you would have to write and test different instructions for each. If you're looking for a ChatGPT alternative built on Dify assistants, that would be a different feature. It might come in the future, but I think it is a different kind of product.
-
Self Checks
1. Is this request related to a challenge you're experiencing?
Yes, our users hope that the client interface can support choosing the LLM they want to use.
2. Describe the feature you'd like to see
Add an LLM selector next to the input box, similar to the chat bar.
3. How will this feature improve your workflow or experience?
It would let us compare multiple LLMs within a single Agent and choose the most appropriate one for each scenario.
4. Additional context or comments
No response
5. Can you help us with this feature?
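The request above boils down to letting the client attach a model choice to each chat message instead of relying on the app's single configured LLM. A minimal sketch of that idea follows; the field names (`query`, `model`) and the list of model IDs are illustrative assumptions, not Dify's actual API schema.

```python
# Sketch: a chat request that carries a client-chosen model.
# Field names and model IDs below are hypothetical assumptions,
# not Dify's real API.

AVAILABLE_MODELS = ["gpt-4o", "claude-3-5-sonnet", "qwen-max"]  # hypothetical

def build_chat_request(query: str, model: str) -> dict:
    """Build a chat payload in which the selected model rides along
    with the user's message, overriding the app-level default."""
    if model not in AVAILABLE_MODELS:
        raise ValueError(f"unknown model: {model}")
    return {"query": query, "model": model}

# A UI dropdown would populate `model` from AVAILABLE_MODELS:
request = build_chat_request("Summarize this document.", "qwen-max")
```

The server would then route the message to the chosen model rather than the single configured one; as the reply above notes, instructions may need per-model tuning for this to work well.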