Connectivity with custom API/endpoint #2086
matheospower started this conversation in Ideas
Is your feature request related to a problem? Please describe.
I'm experimenting with various LLM pipelines and would like to use open-webui as the chat interface. To achieve this, I'd like to request a feature that allows connecting the chat interface to my custom pipeline API/endpoint.
Describe the solution you'd like
Ideally, I'd want to connect to my custom pipeline through an API by providing a URL (e.g., http://localhost:4217/) and a name for the assistant (e.g., "Custom Assistant"). The generated text would then be displayed in the chat UI.
Describe alternatives you've considered
I've tried exposing my custom endpoint as an Ollama- or OpenAI-compatible API, but haven't succeeded so far. Guidance on what changes are required, or a guide on creating a compatible custom API, would be helpful.
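For the OpenAI-compatible route, a sketch of the minimum surface such an endpoint would need is below. This is a stdlib-only illustration, not open-webui's documented contract: the model id `custom-assistant`, the port `4217`, and the echo reply are all placeholders, and a real pipeline would also need to handle streaming (`"stream": true`) if the client requests it.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical model id shown in the chat UI's model picker.
MODEL_ID = "custom-assistant"

class Handler(BaseHTTPRequestHandler):
    def _send_json(self, payload):
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        # OpenAI-style clients discover models via GET /v1/models.
        if self.path == "/v1/models":
            self._send_json({"object": "list",
                             "data": [{"id": MODEL_ID, "object": "model"}]})
        else:
            self.send_error(404)

    def do_POST(self):
        # Chat requests arrive at POST /v1/chat/completions.
        if self.path == "/v1/chat/completions":
            length = int(self.headers.get("Content-Length", 0))
            request = json.loads(self.rfile.read(length) or b"{}")
            user_text = request.get("messages", [{}])[-1].get("content", "")
            # Replace this echo with a call into your own pipeline.
            reply = f"Pipeline reply to: {user_text}"
            self._send_json({
                "id": "chatcmpl-0",
                "object": "chat.completion",
                "model": MODEL_ID,
                "choices": [{
                    "index": 0,
                    "message": {"role": "assistant", "content": reply},
                    "finish_reason": "stop",
                }],
            })
        else:
            self.send_error(404)

def run(port=4217):
    """Block and serve; point the chat UI's OpenAI base URL at this host."""
    HTTPServer(("", port), Handler).serve_forever()
```

With this running, the base URL (e.g. `http://localhost:4217/v1`) would be entered where the UI expects an OpenAI API endpoint; whether that is sufficient for open-webui specifically is exactly what I'm asking for confirmation on.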
Additional context
I've searched through GitHub issues and Discord channels but couldn't find an exact match for this request. If someone could provide guidance on how to proceed, I'd be happy to contribute to the solution.