
FlowGen - AutoGen Visualized


🤖 What is FlowGen

FlowGen is a tool built for AutoGen, an excellent agent framework from Microsoft and many contributors.

We regard AutoGen as one of the best frontier technologies for next-generation Multi-Agent Applications. FlowGen elevates this concept by providing intuitive visual tools that streamline the construction and oversight of complex agent-based workflows, simplifying the entire process for creators and developers.

Contributions (issues, pull requests, even typo corrections) to this project are welcome! All contributors will be added to the Contribution Wall.

Autoflow

You can create an Autoflow from scratch, or fork one from a template. An Autoflow is visualized as a graph, and you can drag and drop nodes to build agents in a flow style.

flow-1

Chat

You can launch an Autoflow or an Autoflow Template in a chat window, and chat with the agents.

chat-1

chat-2

Template

A place to share and discover flow templates.

template-1

💡 Quickstart

To quickly explore what FlowGen has to offer, simply visit https://platform.flowgen.app.

For a more in-depth look at the platform, please refer to our Getting Started and other documents.

Migration of Official Notebooks

We have made tutorials based on the official notebooks from the AutoGen repository. You can refer to the original notebooks here.

🔲 Planned/Working βœ… Completed πŸ†˜ With Issues β­• Out of Scope

Example Status Comments
simple_chat ✅ Simple Chat
auto_feedback_from_code_execution ✅ Feedback from Code Execution
auto_build ⭕ This is a feature to consider adding to flow generation. #40
chess 🔲 Depends on the ability to import custom Agents. #38
compression ✅
dalle_and_gpt4v ✅ Supported with app.extensions
function_call_async ✅
function_call ✅
graph_modelling_language ⭕ Out of project scope. Open an issue if necessary.
group_chat_RAG 🆘 This notebook does not work.
groupchat_research ✅
groupchat_vis ✅
groupchat ✅
hierarchy_flow_using_select_speaker 🔲
human_feedback ✅ Human in the Loop
inception_function 🔲
langchain ⭕ No plan to support.
lmm_gpt-4v ✅
lmm_llava ✅ Depends on Replicate.
MathChat ✅ Math Chat
oai_assistant_function_call ✅
oai_assistant_groupchat 🆘 Very slow and does not work well; sometimes it does not return.
oai_assistant_retrieval ✅ Retrieval (OAI)
oai_assistant_twoagents_basic ✅
oai_code_interpreter ✅
planning ✅ This sample works fine, but it does not exit gracefully.
qdrant_RetrieveChat 🔲
RetrieveChat 🔲
stream 🔲
teachability 🔲
teaching 🔲
two_users ✅ The response can be very long, so set a large max_tokens.
video_transcript_translate_with_whisper ✅ Depends on the ffmpeg library; brew install ffmpeg and export IMAGEIO_FFMPEG_EXE. Since ffmpeg takes up too much space, the online version has removed this support.
web_info ✅
cq_math ⭕ This example is largely irrelevant to AutoGen; why not just use the OpenAI API?
Async_human_input ⭕ Needs a concrete scenario.
oai_chatgpt_gpt4 ⭕ Fine-tuning, out of project scope.
oai_completion ⭕ Fine-tuning, out of project scope.

🐳 Run Locally (with Docker)

The project contains a frontend (built with Next.js) and a backend service (built with Flask in Python), both of which are fully dockerized.

The easiest way to run locally is with docker-compose:

docker-compose up -d
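
Once the stack is up, standard docker-compose commands can be used to confirm that the services are running and to inspect their logs:

docker-compose ps        # list the services and their current state
docker-compose logs -f   # follow the logs of all services (Ctrl+C to stop)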

You can also build and run the ui, api, and database services separately with docker:

docker build -t flowgen-api ./api
docker run -d -p 5004:5004 flowgen-api

docker build -t flowgen-ui ./ui
docker run -d -p 2855:2855 flowgen-ui

docker build -t flowgen-db ./pocketbase
docker run -d -p 7676:7676 flowgen-db

(The default port number 2855 is the address of our first office.)
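
If you want a quick sanity check that each container is listening, you can probe the published ports from the host. The exact responses depend on each service's routes, so treat these only as reachability checks:

curl -I http://localhost:2855   # ui (Next.js)
curl -I http://localhost:5004   # api
curl -I http://localhost:7676   # PocketBase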

🚀 Deployment

Deploy on Railway

Railway.app supports deploying Dockerized applications. Clicking the "Deploy on Railway" button streamlines the setup and deployment of your application on the Railway platform:

  1. Click the "Deploy on Railway" button to start the process on Railway.app.
  2. Log in to Railway and set the following environment variables:
    • PORT: set it to 2855, 5004, and 7676 for the ui, api, and PocketBase services respectively.
  3. Confirm the settings and deploy.
  4. After deployment, visit the provided URL to access your deployed application.

🛠️ Run Locally (Without Docker)

If you're interested in contributing to the development of this project or wish to run it from source, you can run the ui, api, and PocketBase services independently. Here's how (a consolidated command sketch follows these steps):

  1. UI (Frontend)

    • Navigate to the ui directory: cd ui.
    • Rename .env.sample to .env.local and set the variables correctly.
    • Install the necessary dependencies with your package manager (e.g., pnpm install or yarn).
    • Run the ui service with the provided start-up script (e.g., pnpm dev or yarn dev).
  2. API (Backend Services)

    • Switch to the api service directory: cd api.
    • Create a virtual environment: python3 -m venv venv.
    • Activate the virtual environment: source venv/bin/activate.
    • Install all required dependencies: pip install -r requirements.txt.
    • Launch the api service: uvicorn app.main:app --reload --port 5004.

REPLICATE_API_TOKEN is needed for the LLaVA agent. If you want to use this agent, make sure the token is present in your environment variables, such as the Environment Variables settings on Railway.app.
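
For local runs, a minimal way to provide the token is to export it in the shell before launching the api service (replace the placeholder with your own token):

export REPLICATE_API_TOKEN=<your-replicate-token>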

  3. PocketBase (Database)

    • Switch to the PocketBase directory: cd pocketbase.
    • Build the container: docker build -t flowgen-db .
    • Run the container: docker run -it --rm -p 7676:7676 flowgen-db
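
Taken together, a typical from-source startup looks roughly like the sketch below. It simply strings the steps above together, run from the repository root in three separate terminals, and assumes pnpm and Python 3 are available on your machine:

# Terminal 1: ui (frontend)
cd ui
cp .env.sample .env.local   # then edit the variables as needed
pnpm install
pnpm dev

# Terminal 2: api (backend)
cd api
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
uvicorn app.main:app --reload --port 5004

# Terminal 3: PocketBase (database)
cd pocketbase
docker build -t flowgen-db .
docker run -it --rm -p 7676:7676 flowgen-db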

Each new commit to the main branch triggers an automatic deployment on Railway.app, ensuring you experience the latest version of the service.

Warning

Changes to the PocketBase project will cause a rebuild and redeployment of all instances, which will wipe all the data.

Please do not use it for production purposes, and make sure you export your flows in time.

Once you've started both the ui and api services by following the steps outlined above, you can access the application by opening your web browser and navigating to http://localhost:2855.

If your services started successfully and are running on the expected ports, you should see the user interface or receive responses from the api service via this URL.

👨‍💻 Contributing

Contributions are welcome! They are not limited to code; documentation and other aspects of the project count as well. You can leave your comments on the Discord Server.

This project welcomes contributions and suggestions. Please read our Contributing Guide first.

If you are new to GitHub, here is a detailed help source on getting involved with development on GitHub.

Please consider contributing to AutoGen, as FlowGen relies on a robust foundation to deliver its capabilities. Your contributions can help enhance the platform's core functionalities, ensuring a more seamless and efficient development experience for Multi-Agent Applications.

This project uses 📦🚀 semantic-release to manage versioning and releases. To avoid overly frequent auto-releases, the release is triggered manually via a GitHub Action.

To follow the Semantic Release process, we enforce the commitlint convention on commit messages. Please refer to Commitlint for more details.
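
As an illustration, commit messages generally follow the Conventional Commits style that commitlint checks; the exact rules depend on this repository's commitlint configuration, and the messages below are hypothetical examples:

git commit -m "feat(ui): add drag handles to agent nodes"
git commit -m "fix(api): handle a missing REPLICATE_API_TOKEN gracefully"
git commit -m "docs: clarify local setup without Docker"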

Contributors Wall

📝 License

The project is licensed under Apache 2.0 with additional terms and conditions.