Your Input Needed: Ollama-webui v1.0 - Important Updates and Exciting News! 🚀 #260
Replies: 2 comments
-
My favourite new project is growing up 🥳 Can't wait to see the features planned on the roadmap, and I'm hoping to maybe write a couple of PRs over the holidays to help move some along. Thanks for all you do; this has already been very useful to my team at my workplace.
-
Do we already have "Access to internet" in the RAG? I'm sure it's a cool feature that would put us ahead of OpenAI. For instance, if we ask the model "Can you summarize the text on website.com for me?", the response would be something like: "Sure, I'd be happy to do that for you. The website says, blah, blah, blah." Then we could follow up with "What could be its pros and cons?"
-
Hello, amazing ollama-webui community! 👋
First and foremost, thank you for your unwavering support and the fantastic response to ollama-webui so far! We hope you're enjoying your holidays and having a great time.
🔔 Important PSA
We've noticed that some users are encountering installation issues, especially those not using the Docker method. If you've been running only the frontend, please take note: the backend component is now required for ollama-webui to function. We can't stress this backend requirement enough.
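For anyone hitting these issues, the Docker method is the simplest way to get both the frontend and the required backend running together. Below is a minimal sketch of such a command; the image name, tag, port mapping, and volume name are assumptions based on common ollama-webui setups, so please check the project README for the authoritative, current command:

```
# Hedged sketch, not the official command: runs the webui (frontend +
# backend) in one container. Image tag, ports, and volume name are
# assumptions; see the README for the exact invocation.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v ollama-webui:/app/backend/data \
  --name ollama-webui \
  --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
```

The `-v` volume keeps your data across container restarts, and `--add-host` lets the container reach an Ollama instance running on the host machine.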
🌟 v1.0 Release Coming Soon!
Version 1.0 is on the horizon, and the backend component will be an absolute necessity for upcoming features, including authentication, role-based access control (RBAC), Retrieval-Augmented Generation (RAG), and web browsing capabilities, all of which are part of our exciting roadmap!
🚀 Introducing "ollama-webui-lite"
We've heard your feedback and understand that some of you want to use just the chat UI without the backend. That's why we'll be launching a stripped-down version of the project called "ollama-webui-lite" soon. It will be a purely frontend solution, packaged as static files that you can serve, embed, or customize as you see fit. Stay tuned for this exciting addition!
📢 Your Input Matters
To enhance our project even further, we'd greatly appreciate your participation in our survey: bit.ly/llm-ui-survey. We plan to utilize your feedback in two significant ways: to improve the software and guide future feature additions, and to craft a research paper detailing the early user experience. Your insights are invaluable to us!
Thank you for being an integral part of the ollama-webui community. This is just the beginning, and with your continued support, we are determined to make ollama-webui the best LLM UI ever! 🌟
Stay tuned, and let's keep making history together!
With heartfelt gratitude,
The ollama-webui Team 💙🚀