| Mod-Squad: Designing Mixtures of Experts As Modular Multi-Task Learners | Jan 1, 2023 | Mixture-of-Experts, Multi-Task Learning |
| Modularity Matters: Learning Invariant Relational Reasoning Tasks | Jun 18, 2018 | Mixture-of-Experts, Relational Reasoning |
| MoE-AMC: Enhancing Automatic Modulation Classification Performance Using Mixture-of-Experts | Dec 4, 2023 | Classification, Mixture-of-Experts |
| MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation | Jan 16, 2022 | Knowledge Distillation, Mixture-of-Experts |
| MoE-CAP: Benchmarking Cost, Accuracy and Performance of Sparse Mixture-of-Experts Systems | Dec 10, 2024 | Benchmarking, Mixture-of-Experts |
| MoEC: Mixture of Expert Clusters | Jul 19, 2022 | Machine Translation, Mixture-of-Experts |
| MoEC: Mixture of Experts Implicit Neural Compression | Dec 3, 2023 | Data Compression, Mixture-of-Experts |
| MoE-DiffIR: Task-customized Diffusion Priors for Universal Compressed Image Restoration | Jul 15, 2024 | Image Restoration, Mixture-of-Experts |
| MoEfication: Conditional Computation of Transformer Models for Efficient Inference | Nov 16, 2021 | Mixture-of-Experts |
| MoE-GPS: Guidelines for Prediction Strategy for Dynamic Expert Duplication in MoE Load Balancing | Jun 9, 2025 | GPU, Mixture-of-Experts |
| MoE-Gyro: Self-Supervised Over-Range Reconstruction and Denoising for MEMS Gyroscopes | May 27, 2025 | Benchmarking, Denoising |
| MoE-Lens: Towards the Hardware Limit of High-Throughput MoE LLM Serving Under Resource Constraints | Apr 12, 2025 | CPU, GPU |
| MoE-Lightning: High-Throughput MoE Inference on Memory-constrained GPUs | Nov 18, 2024 | Computational Efficiency, CPU |
| MoE-Loco: Mixture of Experts for Multitask Locomotion | Mar 11, 2025 | Mixture-of-Experts |
| MoELoRA: Contrastive Learning Guided Mixture of Experts on Parameter-Efficient Fine-Tuning for Large Language Models | Feb 20, 2024 | Common Sense Reasoning, Contrastive Learning |
| MoEMba: A Mamba-based Mixture of Experts for High-Density EMG-based Hand Gesture Recognition | Feb 9, 2025 | Gesture Recognition, Hand Gesture Recognition |
| MoEMoE: Question Guided Dense and Scalable Sparse Mixture-of-Expert for Multi-source Multi-modal Answering | Mar 8, 2025 | Answer Generation, Mixture-of-Experts |
| MoENAS: Mixture-of-Expert based Neural Architecture Search for jointly Accurate, Fair, and Robust Edge Deep Neural Networks | Feb 11, 2025 | Fairness, Image Classification |
| MoE Parallel Folding: Heterogeneous Parallelism Mappings for Efficient Large-Scale MoE Model Training with Megatron Core | Apr 21, 2025 | Mixture-of-Experts |
| MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router | Oct 15, 2024 | Knowledge Distillation, Language Modeling |
| MoESD: Mixture of Experts Stable Diffusion to Mitigate Gender Bias | Jun 25, 2024 | Mixture-of-Experts |
| MoESD: Unveil Speculative Decoding's Potential for Accelerating Sparse MoE | May 26, 2025 | Mixture-of-Experts |
| MoE-SPNet: A Mixture-of-Experts Scene Parsing Network | Jun 19, 2018 | Mixture-of-Experts, Scene Parsing |
| MoET: Interpretable and Verifiable Reinforcement Learning via Mixture of Expert Trees | Sep 25, 2019 | Deep Reinforcement Learning, Game of Go |
| MoETuner: Optimized Mixture of Expert Serving with Balanced Expert Placement and Token Routing | Feb 10, 2025 | GPU, Mixture-of-Experts |