| Title | Date | Tags | Code |
| --- | --- | --- | --- |
| MoRE-Brain: Routed Mixture of Experts for Interpretable and Generalizable Cross-Subject fMRI Visual Decoding | May 21, 2025 | Mixture-of-Experts | Code Available |
| More Experts Than Galaxies: Conditionally-overlapping Experts With Biologically-Inspired Fixed Routing | Oct 10, 2024 | Image Classification | Code Available |
| MoNTA: Accelerating Mixture-of-Experts Training with Network-Traffic-Aware Parallel Optimization | Nov 1, 2024 | Mixture-of-Experts | Code Available |
| MoLEx: Mixture of Layer Experts for Finetuning with Sparse Upcycling | Mar 14, 2025 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | Code Available |
| Mol-MoE: Training Preference-Guided Routers for Molecule Generation | Feb 8, 2025 | Benchmarking, Drug Design | Code Available |
| MOoSE: Multi-Orientation Sharing Experts for Open-set Scene Text Recognition | Jul 26, 2024 | Mixture-of-Experts, Scene Text Recognition | Code Available |
| Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments | May 26, 2025 | Data-Free Knowledge Distillation, Federated Learning | Code Available |
| AskChart: Universal Chart Understanding through Textual Enhancement | Dec 26, 2024 | Chart Understanding, Mixture-of-Experts | Code Available |
| MoE-MLoRA for Multi-Domain CTR Prediction: Efficient Adaptation with Expert Specialization | Jun 9, 2025 | Click-Through Rate Prediction, Diversity | Code Available |
| MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing | Aug 21, 2024 | Mixture-of-Experts | Code Available |