| Title | Date | Tags | Status | Count |
|---|---|---|---|---|
| RS-MoE: Mixture of Experts for Remote Sensing Image Captioning and Visual Question Answering | Nov 3, 2024 | Descriptive, Image Captioning | — Unverified | 0 |
| RTM Ensemble Learning Results at Quality Estimation Task | Nov 1, 2020 | Ensemble Learning, Mixture-of-Experts | — Unverified | 0 |
| RTM Stacking Results for Machine Translation Performance Prediction | Aug 1, 2019 | Machine Translation, Mixture-of-Experts | — Unverified | 0 |
| RTM Super Learner Results at Quality Estimation Task | Nov 1, 2021 | Mixture-of-Experts, Translation | — Unverified | 0 |
| S2MoE: Robust Sparse Mixture of Experts via Stochastic Learning | Mar 29, 2025 | Mixture-of-Experts | — Unverified | 0 |
| Safe Real-World Autonomous Driving by Learning to Predict and Plan with a Mixture of Experts | Nov 3, 2022 | Autonomous Driving, Autonomous Vehicles | — Unverified | 0 |
| SAFEx: Analyzing Vulnerabilities of MoE-Based LLMs via Stable Safety-critical Expert Identification | Jun 20, 2025 | Mixture-of-Experts, Response Generation | — Unverified | 0 |
| SAM-Med3D-MoE: Towards a Non-Forgetting Segment Anything Model via Mixture of Experts for 3D Medical Image Segmentation | Jul 6, 2024 | General Knowledge, Image Segmentation | — Unverified | 0 |
| Scalable and Efficient MoE Training for Multitask Multilingual Models | Sep 22, 2021 | Machine Translation, Mixture-of-Experts | — Unverified | 0 |
| Scalable Multi-Domain Adaptation of Language Models using Modular Experts | Oct 14, 2024 | Domain Adaptation, General Knowledge | — Unverified | 0 |