ICLR 2024 Papers and Open-Source Projects Collection
Updated May 19, 2024
Foundation model benchmarking tool. Run any model on Amazon SageMaker and benchmark for performance across instance type and serving stack options.
A high-performance inference system for large language models, designed for production environments.
A user-interface for chatting with LLMs using the Ollama API!
Streaming API and web page for Large Language Models (Llama3), built on Transformers + Flask + Gradio.
Explores how to analyse book collections, Large Language Model style.
🤖 The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. It can generate text, audio, video, and images, and also supports voice cloning.
Python scraper based on AI
Start building LLM-empowered multi-agent applications in an easier way.
Unify Efficient Fine-Tuning of 100+ LLMs
End-to-end platform for building voice first multimodal agents
Fluent CLI is an advanced command-line interface designed to interact seamlessly with multiple workflow systems like FlowiseAI, Langflow, Make, and Zapier. Tailored for developers and IT professionals, Fluent CLI facilitates robust automation, simplifies complex interactions, and enhances productivity through a powerful command suite.
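Several of the projects above expose chat models through Ollama. As a minimal sketch of what talking to the Ollama API looks like, the snippet below POSTs one user message to Ollama's `/api/chat` endpoint; it assumes Ollama is running locally on its default port 11434, and the helper names are illustrative, not taken from any of the listed projects.

```python
import json
import urllib.request

# Ollama's default local chat endpoint (assumes a local Ollama install).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, messages, stream=False):
    """Assemble the JSON body that Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": stream}

def chat(model, prompt):
    """Send one user message and return the assistant's reply text."""
    payload = build_chat_payload(
        model, [{"role": "user", "content": prompt}]
    )
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]
```

With a llama3 model pulled locally, `chat("llama3", "Why is the sky blue?")` returns the model's reply as a string; set `stream=True` in the payload to receive newline-delimited JSON chunks instead.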
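Tools like LocalAI above advertise themselves as drop-in OpenAI replacements, meaning any client that speaks the OpenAI `/v1/chat/completions` wire format can point at the local server instead. A minimal sketch of that idea, assuming the server listens on LocalAI's default port 8080 and with illustrative helper names:

```python
import json
import urllib.request

# Assumed local OpenAI-compatible base URL (LocalAI defaults to port 8080).
BASE_URL = "http://localhost:8080/v1"

def completion_payload(model, prompt, temperature=0.7):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat_completion(model, prompt):
    """POST the request and pull the reply out of the OpenAI-shaped response."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(completion_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the request and response shapes match the hosted OpenAI API, switching between a local backend and the hosted service is just a change of `BASE_URL`.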
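The streaming Transformers + Flask projects listed above typically follow one pattern: generation runs in a background thread, pushes decoded tokens into a queue, and an HTTP handler drains the queue as server-sent events. The sketch below shows that pattern with the model stubbed out (the real token source would be `model.generate` feeding a `TextIteratorStreamer`); all names are illustrative.

```python
import queue
import threading

def sse_event(token: str) -> str:
    """Wrap one token in the server-sent-events wire format."""
    return f"data: {token}\n\n"

def fake_generate(prompt, out_q):
    """Stand-in for threaded model generation: a real app would push
    decoded tokens here as the model produces them."""
    for token in ("Hello", ", ", "world", "!"):
        out_q.put(token)
    out_q.put(None)  # sentinel: generation finished

def stream_response(prompt):
    """Yield SSE chunks as tokens arrive. In Flask this generator would be
    wrapped as Response(stream_response(p), mimetype='text/event-stream')."""
    q = queue.Queue()
    threading.Thread(target=fake_generate, args=(prompt, q)).start()
    while (token := q.get()) is not None:
        yield sse_event(token)
```

The queue decouples generation speed from network speed, so the browser starts rendering the first tokens while the model is still producing the rest.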