
OpenLocalUI: Native desktop app for Windows, MacOS and Linux. Easily run Large Language Models locally, no complex setups required. Inspired by OpenWebUI's simplicity for LLM use.


OpenLocalUI

(Screenshot)

Table of Contents

  1. What is OpenLocalUI
  2. Features
  3. Roadmap
  4. Installation
  5. Contributing
  6. License

What is OpenLocalUI

OpenLocalUI is a Flutter-based desktop application designed for Windows and macOS users. It aims to provide a user-friendly interface for running LLMs (Large Language Models) locally without the need for complex setups like WSL or Docker containers. Taking inspiration from OpenWebUI, which offers similar functionality in a browser-based environment, OpenLocalUI brings the convenience of a native desktop app.

Features

  1. Native Desktop Experience: OpenLocalUI is designed specifically for Windows and macOS platforms, ensuring seamless integration with your operating system.

  2. LLM Execution: Run Ollama-based models directly from your desktop, eliminating the need for external dependencies like WSL or Docker containers.

  3. MIT License: OpenLocalUI is licensed under the permissive MIT License, encouraging contributions from the community and fostering an open-source development environment.

Roadmap

Despite its simplicity, OpenLocalUI has enormous potential for growth and enhancement. Based on the LangChain Dart API, future updates will focus on adding more features and improving usability. Planned features include:

  1. Model Customization: Enhance the ability to customize LLM models according to specific needs.
  2. Image and File Embedding: Enable embedding images and files directly into the application for more versatile usage.
  3. Improved User Interface: Enhance the user experience with a more intuitive and visually appealing interface.

Installation

OpenLocalUI requires the OLLAMA client to be installed and running on your system.
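Before launching the app, it can help to confirm the Ollama client is actually on your PATH. This is not part of the official setup, just a quick sanity check you can adapt (the download URL is Ollama's official site):

```shell
# Sanity check: is the Ollama client installed and reachable?
if command -v ollama >/dev/null 2>&1; then
  # Print the installed client version
  ollama --version
else
  echo "Ollama not found; download it from https://ollama.com"
fi
```

If the client is installed but the app cannot reach it, make sure the Ollama server is running (on desktop installs it usually starts automatically).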

The app's new text-to-speech feature also requires FFmpeg for audio encoding/decoding. You can install it from the official website or via your system's package manager:

choco install ffmpeg    # Windows (requires Chocolatey)
sudo apt install ffmpeg # Linux (Debian/Ubuntu)
brew install ffmpeg     # macOS (requires Homebrew)
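After installing, you can verify that FFmpeg is visible on your PATH; this check is a suggestion, not part of the official instructions:

```shell
# Confirm FFmpeg is installed and print its version banner
if command -v ffmpeg >/dev/null 2>&1; then
  ffmpeg -version | head -n 1
else
  echo "ffmpeg not found on PATH"
fi
```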

Check out the latest release and download the appropriate version for your platform. In the future, all distributed versions will automatically install the required libraries on your system.

Contributing

Contributions to OpenLocalUI are highly encouraged and welcomed. Whether you're a developer, designer, or enthusiast, there are various ways to contribute:

  • Code Contributions: Help improve the application by submitting code patches, bug fixes, or new features.
  • Documentation: Improve existing documentation or create new guides to help users understand and use OpenLocalUI effectively.
  • Feedback and Suggestions: Share your thoughts, ideas, and feedback to help shape the future development of OpenLocalUI.

Please refer to the CONTRIBUTING.md file for more details on how to contribute.

License

OpenLocalUI is released under the permissive MIT License to encourage contributions. See the LICENSE.md file for more information.
