Allow to be compiled headlessly in a Docker container #105

Open · wants to merge 4 commits into develop

Conversation


@token-cjg commented Mar 26, 2022

TLDR

This change allows Protongraph to be compiled so that it can run headlessly in a Docker container. Run this way, Protongraph can produce to Kafka, provided the config folder contains a valid kafka.config and, where applicable, valid secrets for the user's Kafka process.
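
The exact keys expected in kafka.config are defined by this change and not reproduced here; purely to illustrate the kind of broker and secret settings involved, here is a minimal kafka-python sketch (broker address, credentials, and topic name are all hypothetical):

```python
# Illustrative only: the kind of broker/credential settings a kafka.config
# would need to carry. Broker, credentials, and topic are hypothetical.
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers=["broker-1.example.com:9096"],
    security_protocol="SASL_SSL",      # omit the SASL settings for an unsecured local broker
    sasl_mechanism="SCRAM-SHA-512",
    sasl_plain_username="protongraph",
    sasl_plain_password="<secret>",
)
producer.send("protongraph-results", b'{"status": "ready"}')
producer.flush()
```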

Detail

This change adds:

  • A Dockerfile.compile to provide an isolated build process, so that the Linux toolchain is used to compile Protongraph,
  • A Dockerfile to run the Protongraph process headlessly on Linux,
  • A number of optimisations to tooling, SConstruct, and the Makefiles to properly support Linux compilation,
  • Some changes to the README describing how this change affects building Protongraph.

In particular, this should allow one to produce to Kafka when running this process locally.

To compile for Docker, run ./compile.sh. Scripts to start, stop and debug the process are in the scripts folder.

To compile for macOS, as per the previous pull request, run make osx.

Synopsis

After this change, one can run Protongraph in default responder mode (wherein it runs locally and responds to messages from a local Godot game that has sync-godot added as a plugin) or in Kafka producer mode.

Default responder mode is best if one plans to run Protongraph locally on a single machine alongside, e.g., a Godot game.

For the more advanced use case of running in Kafka producer mode, one might model things broadly as follows:

[architecture diagram: protongraph]

Broadly: the client sends messages to a signalling server, which produces to Kafka; a consumer reads from Kafka and forwards the messages via WebSocket to Protongraph (this process); Protongraph then produces its output back to Kafka, where a signalling-server consumer picks it up and relays it to the signalling server, and from there back to any connected clients.
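
As a minimal sketch of the middle hop (consume a request from Kafka and forward it over WebSocket to the headless Protongraph process), assuming kafka-python and websocket-client; the topic name, port, and payload shape are hypothetical:

```python
# Hypothetical relay: consume Protongraph job requests from Kafka and forward
# them over WebSocket to the headless Protongraph process.
from kafka import KafkaConsumer          # pip install kafka-python
from websocket import create_connection  # pip install websocket-client

consumer = KafkaConsumer(
    "protongraph-requests",                    # hypothetical request topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: raw.decode("utf-8"),
)
ws = create_connection("ws://localhost:4444")  # hypothetical Protongraph endpoint/port

for record in consumer:
    # Each record is assumed to carry a JSON payload naming the tpgn tree to run.
    ws.send(record.value)
```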

To take full advantage of Protongraph's capabilities when running in this mode, every process (save for the user-defined client process, e.g. a Godot game) should be deployed to the cloud.

* Run load_or_create_config on init

* Document payload, debug statements

* Add start and stop commands

* Add script to get binaries

* Invoke obtain_binaries from main compile script

* Ensure that we can perform work on the input

* WIP

* Fix build

* Complete fix for remote build
@fire (Collaborator) commented Jun 16, 2023

Sorry, I think this got dropped into limbo. Are you able to talk more about the usage of this?

@token-cjg (Author)

@fire, certainly. It has been some time since I looked at this, but the general idea was to allow Protongraph to run as a Docker process so that one could deploy it to the cloud, use an event-based architecture to send messages to it indirectly, and have it produce messages back to a separate Kafka topic.

(Apologies if the following is rather verbose; communication is not my strong suit. Hopefully it provides sufficient context on the general intent of this contribution as originally made, and is of use to you, the other maintainers, and the founder HungryProton in setting potential future direction for the Protongraph project.)

In brief

To unpack this a little more, the general idea was to work towards networked procedural generation, i.e. a networked Godot game in which Protongraph runs as a background service and produces procedurally generated content from Protongraph tpgn files on the fly.

In more detail

To build this out, the general idea was:

  • Leveraging the Godot game engine (though it doesn't need to be Godot), communicate the intent to execute a Protongraph tree, via a Protongraph plugin (i.e. https://github.com/protongraph/sync-godot) over WebSocket, to a signalling server (I used Node.js, but, again, this could be any web service that uses a websockets library),
  • The signalling server identifies the intent to execute a Protongraph tree and produces to a Kafka topic (a sketch of this step follows the list),
  • Have some Kafka architecture (Amazon MSK or one's own Kafka cluster) with topics, which then receives this message,
  • Have a separate Kafka consumer sitting somewhere else in the cloud receive this message,
  • This message is then sent to the Protongraph process running in Docker, which contains the relevant Protongraph tree (tpgn file).
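
The signalling server I used was Node.js; purely as a rough Python stand-in for the first two steps (accept a WebSocket message from the game plugin and produce it to Kafka), with the port and topic name being hypothetical:

```python
# Hypothetical Python stand-in for the Node.js signalling server: accept
# WebSocket messages from the game plugin and produce them to a Kafka topic.
import asyncio
from kafka import KafkaProducer   # pip install kafka-python
import websockets                 # pip install websockets

producer = KafkaProducer(bootstrap_servers=["localhost:9092"])

async def handle_client(websocket, path=None):  # path kept for older websockets versions
    async for message in websocket:
        # Assume any message naming a tpgn tree is a request to execute it.
        data = message if isinstance(message, bytes) else message.encode("utf-8")
        producer.send("protongraph-requests", data)

async def main():
    async with websockets.serve(handle_client, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

asyncio.run(main())
```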

After this, the idea was to

  • stream the response by producing back to a different Kafka topic from Protongraph (I managed to get this working in a separate change),
  • create a consumer to listen to this Kafka topic,
  • send a message from that consumer to the signalling server,
  • interpret that message and broadcast it back to the original game that requested it (a sketch of this return leg follows the list).
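
And a rough sketch of that return leg on the signalling-server side (consume Protongraph results from Kafka and broadcast them to every connected client), again with hypothetical topic and port:

```python
# Hypothetical return path: consume Protongraph results from Kafka and
# broadcast them to every game client connected to the signalling server.
import asyncio
from kafka import KafkaConsumer   # pip install kafka-python
import websockets                 # pip install websockets

connected_clients = set()

async def track_client(websocket, path=None):
    connected_clients.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        connected_clients.discard(websocket)

async def pump_results():
    consumer = KafkaConsumer(
        "protongraph-results",                 # hypothetical results topic
        bootstrap_servers=["localhost:9092"],
        consumer_timeout_ms=100,               # yield control back to the event loop periodically
    )
    while True:
        for record in consumer:                # simplified: briefly blocks the loop while polling
            websockets.broadcast(connected_clients, record.value)
        await asyncio.sleep(0.1)

async def main():
    async with websockets.serve(track_client, "0.0.0.0", 8766):
        await pump_results()

asyncio.run(main())
```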

(The advantage of doing this is that each and every user connected to a networked experience can get the same procedurally generated information streamed to their game during runtime, which is potentially quite powerful.)

I managed to get this all working in prototype code; I can search for it if you are interested.

Limitations / follow-up thoughts / miscellany

The main limitation I ran into was the lack of a natural API for working with the node hierarchy in Godot, so I ended up writing a lot of ugly, hacked-together code that didn't feel clean or particularly maintainable. One potential solution would be to leverage something like Pixar's Universal Scene Description, although I'm not sure exactly what that would involve, how it would interoperate with the Godot game engine, or whether any modules would need to be built with GDExtension to facilitate it.

I did come across a separate project called Blackjack, which is built around the godot-rust ecosystem. It seems to follow the Unix philosophy of small modules and services that each focus on doing one thing really well.

@fire (Collaborator) commented Jun 19, 2023

Here are some of my thoughts.

  1. Godot Engine's Native glTF Exporter: The Godot Engine features a native glTF exporter, which can potentially be utilized for exporting 3D models and scenes generated by Protongraph. More information is available in the official documentation.

  2. Protongraph on Godot Engine 4.0: Protongraph is now compatible with Godot Engine 4.0, offering new features and enhancements over previous versions.

  3. Membrane Guide for Pipelines: The Membrane Guide serves as a valuable resource for constructing media processing pipelines.

  4. Website Mesh Extractor: An ArchiveBox issue highlights the need for a website mesh extractor, which could be beneficial in the context of networked procedural generation.

  5. OpenMfx Support: OpenMfx is a project offering support for Blender, C++, and Houdini, and may be relevant for integrating Protongraph with other tools and platforms.

  6. Challenges with USD: Universal Scene Description (USD) is difficult to recreate for entities not affiliated with Apple. Although tinyusdz attempted this, it lacks animation or skeleton support.

  7. Khepri for Erlang and Elixir: Khepri is a tree-like replicated on-disk database library for Erlang and Elixir, useful for managing data in networked procedural generation projects.

  8. V-Sekai Project: The V-Sekai project is developing a free, open-source virtual reality platform, aiming to make the platform user-friendly and cater to the VR community's needs.

  9. Wings3D: Wings3D is an advanced subdivision 3D modeler offering a wide range of modeling tools and supporting various file formats for import and export. It can be a valuable tool for creating and editing 3D models in the context of networked procedural generation using Protongraph.

  10. Large Language Model Chatbots: AI-powered chatbots, such as large language models, can provide valuable assistance with planning, design, and coding projects. They can help answer questions, suggest best practices, and guide you through the development process.

  11. LibGodot: LibGodot is a system that enables Godot to be compiled as a library and connected to using GDExtensions. By providing a function pointer to the entry point of a GDExtension implementation before starting Godot, LibGodot adds support for languages that have C interoperability but cannot themselves be compiled into a shared library.

  12. Elixir Phoenix: Elixir Phoenix is a powerful web framework that can be used to serve REST calls, making it an excellent choice for building scalable and maintainable web applications.

  13. Elixir Nx: Elixir Nx is a library for numerical computing that supports GPU-accelerated compute jobs. This enables faster processing of large-scale data and complex calculations, making it suitable for machine learning, scientific computing, and other high-performance tasks.
