Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
Updated Jul 20, 2025
Official implementation of "Optimal Sparsity of Mixture-of-Experts Language Models for Reasoning Tasks"