Support more LLMs #142
Comments
Thanks for raising @chymian, this is a high-priority ticket at the moment. We'll keep you posted.

Did you have a look at litellm.ai?
Is your feature request related to a problem? Please describe.
Local LLMs are not sufficiently supported.
Describe the solution you'd like
Since most of the local loaders, like Ollama, ooba's, etc., support the OpenAI API >= 1.0, we would just need a BASE_URL field where we can fill in the local path to the API.
The LLMstudio entry (which is not open source) had such a field, but it's gone.
Describe alternatives you've considered
Build in litellm.ai as a gateway to 100+ LLMs.
Additional context