moe
Here are 108 public repositories matching this topic...
Implementation of "the first large-scale multimodal mixture of experts models" from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts".
Updated Mar 11, 2024 - Python
Updated May 1, 2017 - Java
Apologies in advance to all Moes. Don't be mad, I love you.
Updated Feb 16, 2023 - Svelte
The most effective and efficient moe counters for your projects, designed to display a wide range of statistics for your website and more!
Updated Mar 21, 2024 - JavaScript
Project Goals: Put all the schools in Singapore on a Google map using their postal codes, and list them on a website. Project Admin: Bryan. Give me 1 ⭐ if it’s cool.
Updated Nov 30, 2020 - HTML
A simple project that helps visualize expert router choices for text generation.
Updated Apr 17, 2024 - Python
This is the repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.
Updated May 14, 2024 - Python
A simple downloader that I've made for twist.moe.
Updated Oct 29, 2020 - Kotlin
⭐ Moe-Counter Compatible Website Hit Counter Written in Gleam
Updated May 14, 2024 - Gleam
Mergekit Assistant is a cutting-edge toolkit designed for the seamless merging of pre-trained language models. It supports an array of models, offers various merging methods, and optimizes for low-resource environments with both CPU and GPU compatibility.
Updated Mar 28, 2024
Meet Moe, a Discord bot written in modern Python!
Updated Apr 24, 2023 - Python
Updated Mar 26, 2024 - Jupyter Notebook