Replies: 1 comment
- Add the Groq API to open-webui to use all their LLMs
Is your feature request related to a problem? Please describe.
GroqCloud now provides free access to its inference cluster, which is too fast to ignore. It also supports Llama3-70b out of the box at speeds of up to 300 tokens/s, which is more than enough for a curious person's use.
Describe the solution you'd like
I'd like to be able to use Open Web UI to act as an interface to GroqCloud to keep all my chats in one place.
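Since GroqCloud exposes an OpenAI-compatible API, one likely path is pointing Open WebUI's existing OpenAI connection at Groq's base URL rather than writing a new backend. A hedged sketch of that configuration (the env var names follow Open WebUI's OpenAI settings; the API key is a placeholder, and the base URL is the one Groq's quickstart documents):

```shell
# Run Open WebUI with its OpenAI-compatible backend pointed at GroqCloud.
# OPENAI_API_BASE_URL / OPENAI_API_KEY are Open WebUI's OpenAI settings;
# replace the key placeholder with a key from https://console.groq.com/.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://api.groq.com/openai/v1" \
  -e OPENAI_API_KEY="gsk_your_key_here" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

With this in place, Groq-hosted models should appear in the model selector alongside any local Ollama models, keeping all chats in one interface.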
Describe alternatives you've considered
Additional context
https://groq.com/
https://console.groq.com/playground
https://console.groq.com/docs/quickstart