| Revolutionizing Disease Diagnosis with simultaneous functional PET/MR and Deeply Integrated Brain Metabolic, Hemodynamic, and Perfusion Networks | Mar 29, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Ring-lite: Scalable Reasoning via C3PO-Stabilized Reinforcement Learning for LLMs | Jun 17, 2025 | Data Integration, Large Language Model | —Unverified | 0 |
| RingMoE: Mixture-of-Modality-Experts Multi-Modal Foundation Models for Universal Remote Sensing Image Interpretation | Apr 4, 2025 | Change Detection, Depth Estimation | —Unverified | 0 |
| Robust and Explainable Depression Identification from Speech Using Vowel-Based Ensemble Learning Approaches | Oct 23, 2024 | Ensemble Learning, Mixture-of-Experts | —Unverified | 0 |
| Robust Audiovisual Speech Recognition Models with Mixture-of-Experts | Sep 19, 2024 | Mixture-of-Experts, Robust Speech Recognition | —Unverified | 0 |
| Robust Calibration For Improved Weather Prediction Under Distributional Shift | Jan 8, 2024 | Data Augmentation, Mixture-of-Experts | —Unverified | 0 |
| Robust mixture of experts modeling using the skew t distribution | Dec 9, 2016 | Clustering, Mixture-of-Experts | —Unverified | 0 |
| Robust mixture of experts modeling using the t distribution | Dec 9, 2016 | Clustering, Mixture-of-Experts | —Unverified | 0 |
| RocketPPA: Code-Level Power, Performance, and Area Prediction via LLM and Mixture of Experts | Mar 27, 2025 | Code Repair, Feature Engineering | —Unverified | 0 |
| Routers in Vision Mixture of Experts: An Empirical Study | Jan 29, 2024 | Language Modeling, Language Modelling | —Unverified | 0 |
| RS-MoE: Mixture of Experts for Remote Sensing Image Captioning and Visual Question Answering | Nov 3, 2024 | Descriptive, Image Captioning | —Unverified | 0 |
| RTM Ensemble Learning Results at Quality Estimation Task | Nov 1, 2020 | Ensemble Learning, Mixture-of-Experts | —Unverified | 0 |
| RTM Stacking Results for Machine Translation Performance Prediction | Aug 1, 2019 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| RTM Super Learner Results at Quality Estimation Task | Nov 1, 2021 | Mixture-of-Experts, Translation | —Unverified | 0 |
| S2MoE: Robust Sparse Mixture of Experts via Stochastic Learning | Mar 29, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Safe Real-World Autonomous Driving by Learning to Predict and Plan with a Mixture of Experts | Nov 3, 2022 | Autonomous Driving, Autonomous Vehicles | —Unverified | 0 |
| SAFEx: Analyzing Vulnerabilities of MoE-Based LLMs via Stable Safety-critical Expert Identification | Jun 20, 2025 | Mixture-of-Experts, Response Generation | —Unverified | 0 |
| SAM-Med3D-MoE: Towards a Non-Forgetting Segment Anything Model via Mixture of Experts for 3D Medical Image Segmentation | Jul 6, 2024 | General Knowledge, Image Segmentation | —Unverified | 0 |
| Scalable and Efficient MoE Training for Multitask Multilingual Models | Sep 22, 2021 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| Scalable Multi-Domain Adaptation of Language Models using Modular Experts | Oct 14, 2024 | Domain Adaptation, General Knowledge | —Unverified | 0 |
| Scalable Neural Data Server: A Data Recommender for Transfer Learning | Jun 19, 2022 | Mixture-of-Experts, Transfer Learning | —Unverified | 0 |
| Scaling and Enhancing LLM-based AVSR: A Sparse Mixture of Projectors Approach | May 20, 2025 | Audio-Visual Speech Recognition, Mixture-of-Experts | —Unverified | 0 |
| Scaling Intelligence: Designing Data Centers for Next-Gen Language Models | Jun 17, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Scaling Laws Across Model Architectures: A Comparative Analysis of Dense and MoE Models in Large Language Models | Oct 8, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Scaling Laws for Native Multimodal Models | Apr 10, 2025 | Mixture-of-Experts | —Unverified | 0 |