The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
Ollama client for Swift
An Ollama client made with GTK4 and Adwaita
HTTP API for Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as Cohere Command, Google Gemini, Maritaca AI MariTalk, Mistral AI, Ollama, OpenAI ChatGPT, and others, with support for calling tools (functions).
💬 Discord AI chatbot using Ollama with the new Ollama API
OllamaChat: A user-friendly GUI for interacting with the llama2 and llama2-uncensored models, hosted locally with Python and KivyMD. Requires Ollama for Windows; for more, visit Ollama on GitHub.
Implements a simple REPL chat with a locally running instance of Ollama.
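A REPL chat against a locally running Ollama instance can be sketched in a few lines. This is a minimal illustration, not the code of the repository above: it assumes Ollama's default endpoint `http://localhost:11434/api/chat` and uses the model name `llama2` as a placeholder.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint

def build_chat_request(model: str, history: list[dict]) -> bytes:
    """Serialize a non-streaming request body for Ollama's /api/chat."""
    return json.dumps({"model": model, "messages": history, "stream": False}).encode()

def chat_once(model: str, history: list[dict]) -> str:
    """Send the conversation so far and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_chat_request(model, history),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

def repl(model: str = "llama2") -> None:
    """Loop: read user input, keep the history, print each reply."""
    history: list[dict] = []
    while True:
        user = input("> ")
        if user in ("exit", "quit"):
            return
        history.append({"role": "user", "content": user})
        reply = chat_once(model, history)
        history.append({"role": "assistant", "content": reply})
        print(reply)
```

Keeping the full `history` in each request is what gives the REPL conversational memory; Ollama itself is stateless between calls.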
Learn how to run Ollama in GitHub Codespaces for free.
A REST API proxy that exposes Vertex AI through the Ollama API interface, optionally forwarding requests for other models to a local Ollama instance. Written in Go.
A package manager for Go
A command line tool for journaling daily accomplishments and summarizing them to create a bragging document.
GPTAggregator is a Python-based application that provides a unified interface to interact with various large language models (LLMs) through their respective APIs. The project aims to simplify the process of working with different LLM providers.
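A unified interface over multiple LLM providers usually comes down to a small abstraction like the one below. This is a hypothetical sketch of the pattern, not GPTAggregator's actual API; `LLMProvider`, `EchoProvider`, and `aggregate` are illustrative names, and a real implementation would wrap the Ollama, OpenAI, etc. clients behind the same method.

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface so callers don't depend on any one vendor SDK."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class EchoProvider(LLMProvider):
    """Stand-in provider for testing; real subclasses would call a provider API."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def aggregate(providers: dict[str, LLMProvider], name: str, prompt: str) -> str:
    """Route a prompt to the named provider through the shared interface."""
    return providers[name].complete(prompt)
```

With this shape, switching providers is a dictionary key, not a code change — which is the simplification such aggregator projects aim for.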
A simple and easy-to-use Ollama web UI.