| Title | Date | Tags |
| --- | --- | --- |
| MoEMba: A Mamba-based Mixture of Experts for High-Density EMG-based Hand Gesture Recognition | Feb 9, 2025 | Gesture Recognition, Hand Gesture Recognition |
| MoEMoE: Question Guided Dense and Scalable Sparse Mixture-of-Expert for Multi-source Multi-modal Answering | Mar 8, 2025 | Answer Generation, Mixture-of-Experts |
| MoENAS: Mixture-of-Expert based Neural Architecture Search for jointly Accurate, Fair, and Robust Edge Deep Neural Networks | Feb 11, 2025 | Fairness, Image Classification |
| MoE Parallel Folding: Heterogeneous Parallelism Mappings for Efficient Large-Scale MoE Model Training with Megatron Core | Apr 21, 2025 | Mixture-of-Experts |
| MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router | Oct 15, 2024 | Knowledge Distillation, Language Modeling |
| MoESD: Mixture of Experts Stable Diffusion to Mitigate Gender Bias | Jun 25, 2024 | Mixture-of-Experts |
| MoESD: Unveil Speculative Decoding's Potential for Accelerating Sparse MoE | May 26, 2025 | Mixture-of-Experts |
| MoE-SPNet: A Mixture-of-Experts Scene Parsing Network | Jun 19, 2018 | Mixture-of-Experts, Scene Parsing |
| MoET: Interpretable and Verifiable Reinforcement Learning via Mixture of Expert Trees | Sep 25, 2019 | Deep Reinforcement Learning, Game of Go |
| MoETuner: Optimized Mixture of Expert Serving with Balanced Expert Placement and Token Routing | Feb 10, 2025 | GPU, Mixture-of-Experts |