A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
👾 Open source implementation of the ChatGPT Code Interpreter
A curated list of Generative AI tools, works, models, and references
Summaries of Prompt & LLM papers, open-source data & models, and AIGC applications
Harness LLMs with Multi-Agent Programming
irresponsible innovation. Try now at https://chat.dev/
Start building LLM-empowered multi-agent applications in an easier way.
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
[AI Agent Application Development Framework] - 🚀 Build AI-agent-native applications with very little code 💬 Easily interact with AI agents in code using structured data and chained-call syntax 🧩 Enhance AI agents with plugins instead of rebuilding a whole new agent
A repo listing papers related to LLM-based agents
Interactive LLM-Powered NPCs is an open-source project that completely transforms your interaction with non-player characters (NPCs) in any game! 🎮🤖🚀
[CVPR 2024 🔥] Grounding Large Multimodal Model (GLaMM), the first-of-its-kind model capable of generating natural language responses that are seamlessly integrated with object segmentation masks.
A bot that works with OpenAI GPT models to provide insights into your info flows.
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLMs, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calls.
A simple Python sandbox for helpful LLM data agents
Official Repo for ICML 2024 paper "Executable Code Actions Elicit Better LLM Agents" by Xingyao Wang, Yangyi Chen, Lifan Yuan, Yizhe Zhang, Yunzhu Li, Hao Peng, Heng Ji.