A DBMS project with Streamlit Frontend for stock management simulation with Backtesting.
Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
This repo demonstrates AI capabilities with Spring Boot.
C program for interacting with Ollama server from a Linux terminal
macOS app for interacting with local LLMs, currently Ollama. Embeds a PyInstaller binary into an unsigned macOS app.
📜 A quest will be assigned to you by an LLM.
Create the prompts you need to write your Novel using AI
Ollama Chat is a GUI for Ollama designed for macOS.
Desktop UI for Ollama made with PyQT
OllamaChat: A user-friendly GUI for interacting with the llama2 and llama2-uncensored AI models, hosted locally with Python and KivyMD. Uses Ollama for Windows. For more, visit Ollama on GitHub.
A command line utility that queries websites for answers using a local LLM
CrewAI Local LLM is a repository for running a large language model (LLM) locally, enabling private, offline AI usage and experimentation.
ollama plugin for asdf version manager
Language Server Protocol for accessing Large Language Models
llamachan is a project that realises the idea of a dead internet for an imageboard
🤖 Discord bot for users to create and interact with locally hosted AI chat models. Powered by Ollama.
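Several of the clients listed above talk to a local Ollama server over its HTTP API. As a minimal sketch (assuming a default Ollama install listening on `http://localhost:11434` and a pulled model named `llama3`), a non-streaming request to the `/api/generate` endpoint looks like this in Python, using only the standard library:

```python
import json
import urllib.request

# Default endpoint for a locally running Ollama server (assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the local server and return the response text."""
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled.
    print(ask("llama3", "Why is the sky blue?"))
```

GUI clients such as those above typically use the streaming variant (`"stream": True`) and render tokens as they arrive; this sketch uses the simpler single-response form.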