Replies: 1 comment
Open WebUI includes LiteLLM as a proxy between various LLM providers and an OpenAI-compatible endpoint. You can add a TGI endpoint to LiteLLM via the UI.
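For reference, a minimal sketch of how LiteLLM routes an OpenAI-style call to a TGI backend (here via its Python SDK rather than the UI; the model ID and endpoint URL are placeholders, not Open WebUI's actual bundled config):

```python
import litellm

# Sketch only: the "huggingface/" prefix selects LiteLLM's TGI-style provider,
# and api_base points at a hypothetical TGI server.
response = litellm.completion(
    model="huggingface/meta-llama/Meta-Llama-3-8B-Instruct",
    api_base="http://llama3.tgi.my-example.com",  # your TGI endpoint
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```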
Is your feature request related to a problem? Please describe.
The existing OpenAI API Endpoints config doesn't work with HuggingFace TGI. TGI serves one model at a time but exposes no /models endpoint (see the TGI API docs), which Open WebUI requires; it also ignores whatever "model" value is passed in the request. As a result, configuring Open WebUI against TGI fails.
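A quick way to see the mismatch (sketch only; the host is the hypothetical example used below, and the exact status codes depend on your TGI version):

```python
import requests

TGI_BASE = "http://llama3.tgi.my-example.com"  # hypothetical TGI host

# Generation works: TGI answers OpenAI-style chat requests and ignores "model".
r = requests.post(
    f"{TGI_BASE}/v1/chat/completions",
    json={"model": "anything", "messages": [{"role": "user", "content": "Hi"}]},
)
print(r.status_code)  # 200 regardless of the "model" value

# Discovery fails: the model-listing route Open WebUI probes is missing.
r = requests.get(f"{TGI_BASE}/v1/models")
print(r.status_code)  # 404 on TGI builds without a /v1/models route
```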
Describe the solution you'd like
The current config uses OPENAI_API_BASE_URLS and OPENAI_API_KEYS, but for TGI it needs to explicitly specify which URL maps to which model, e.g. Llama3 -> http://llama3.tgi.my-example.com/v1, Gemma -> http://gemma.tgi.my-example.com/v1, etc.
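Something like the following per-model routing table (names and URLs purely illustrative, mirroring the examples above) would capture the intent:

```python
# Hypothetical shape of the requested config: an explicit model-name -> base-URL
# map instead of the flat OPENAI_API_BASE_URLS / OPENAI_API_KEYS lists.
MODEL_URL_MAP = {
    "Llama3": "http://llama3.tgi.my-example.com/v1",
    "Gemma": "http://gemma.tgi.my-example.com/v1",
}

def base_url_for(model: str) -> str:
    """Resolve the TGI base URL serving the requested model."""
    return MODEL_URL_MAP[model]

assert base_url_for("Llama3") == "http://llama3.tgi.my-example.com/v1"
```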