| Revolutionizing Disease Diagnosis with Simultaneous Functional PET/MR and Deeply Integrated Brain Metabolic, Hemodynamic, and Perfusion Networks | Mar 29, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Ring-lite: Scalable Reasoning via C3PO-Stabilized Reinforcement Learning for LLMs | Jun 17, 2025 | Data Integration, Large Language Model | —Unverified | 0 |
| RingMoE: Mixture-of-Modality-Experts Multi-Modal Foundation Models for Universal Remote Sensing Image Interpretation | Apr 4, 2025 | Change Detection, Depth Estimation | —Unverified | 0 |
| Robust and Explainable Depression Identification from Speech Using Vowel-Based Ensemble Learning Approaches | Oct 23, 2024 | Ensemble Learning, Mixture-of-Experts | —Unverified | 0 |
| Robust Audiovisual Speech Recognition Models with Mixture-of-Experts | Sep 19, 2024 | Mixture-of-Experts, Robust Speech Recognition | —Unverified | 0 |
| Robust Calibration For Improved Weather Prediction Under Distributional Shift | Jan 8, 2024 | Data Augmentation, Mixture-of-Experts | —Unverified | 0 |
| Robust mixture of experts modeling using the skew t distribution | Dec 9, 2016 | Clustering, Mixture-of-Experts | —Unverified | 0 |
| Robust mixture of experts modeling using the t distribution | Dec 9, 2016 | Clustering, Mixture-of-Experts | —Unverified | 0 |
| RocketPPA: Code-Level Power, Performance, and Area Prediction via LLM and Mixture of Experts | Mar 27, 2025 | Code Repair, Feature Engineering | —Unverified | 0 |
| Routers in Vision Mixture of Experts: An Empirical Study | Jan 29, 2024 | Language Modeling | —Unverified | 0 |
| RS-MoE: Mixture of Experts for Remote Sensing Image Captioning and Visual Question Answering | Nov 3, 2024 | Descriptive, Image Captioning | —Unverified | 0 |
| RTM Ensemble Learning Results at Quality Estimation Task | Nov 1, 2020 | Ensemble Learning, Mixture-of-Experts | —Unverified | 0 |
| RTM Stacking Results for Machine Translation Performance Prediction | Aug 1, 2019 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| RTM Super Learner Results at Quality Estimation Task | Nov 1, 2021 | Mixture-of-Experts, Translation | —Unverified | 0 |
| S2MoE: Robust Sparse Mixture of Experts via Stochastic Learning | Mar 29, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Safe Real-World Autonomous Driving by Learning to Predict and Plan with a Mixture of Experts | Nov 3, 2022 | Autonomous Driving, Autonomous Vehicles | —Unverified | 0 |
| SAFEx: Analyzing Vulnerabilities of MoE-Based LLMs via Stable Safety-critical Expert Identification | Jun 20, 2025 | Mixture-of-Experts, Response Generation | —Unverified | 0 |
| SAM-Med3D-MoE: Towards a Non-Forgetting Segment Anything Model via Mixture of Experts for 3D Medical Image Segmentation | Jul 6, 2024 | General Knowledge, Image Segmentation | —Unverified | 0 |
| Scalable and Efficient MoE Training for Multitask Multilingual Models | Sep 22, 2021 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| Scalable Multi-Domain Adaptation of Language Models using Modular Experts | Oct 14, 2024 | Domain Adaptation, General Knowledge | —Unverified | 0 |
| Scalable Neural Data Server: A Data Recommender for Transfer Learning | Jun 19, 2022 | Mixture-of-Experts, Transfer Learning | —Unverified | 0 |
| Scaling and Enhancing LLM-based AVSR: A Sparse Mixture of Projectors Approach | May 20, 2025 | Audio-Visual Speech Recognition, Mixture-of-Experts | —Unverified | 0 |
| Scaling Intelligence: Designing Data Centers for Next-Gen Language Models | Jun 17, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Scaling Laws Across Model Architectures: A Comparative Analysis of Dense and MoE Models in Large Language Models | Oct 8, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Scaling Laws for Native Multimodal Models | Apr 10, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Scaling Vision-Language Models with Sparse Mixture of Experts | Mar 13, 2023 | Mixture-of-Experts | —Unverified | 0 |
| SCFCRC: Simultaneously Counteract Feature Camouflage and Relation Camouflage for Fraud Detection | Jan 21, 2025 | Contrastive Learning, Fraud Detection | —Unverified | 0 |
| SciDFM: A Large Language Model with Mixture-of-Experts for Science | Sep 27, 2024 | Language Modeling | —Unverified | 0 |
| SC-MoE: Switch Conformer Mixture of Experts for Unified Streaming and Non-streaming Code-Switching ASR | Jun 26, 2024 | Automatic Speech Recognition (ASR) | —Unverified | 0 |
| Security Assessment of DeepSeek and GPT Series Models against Jailbreak Attacks | Jun 23, 2025 | Mixture-of-Experts, Safety Alignment | —Unverified | 0 |
| Seed1.5-Thinking: Advancing Superb Reasoning Models with Reinforcement Learning | Apr 10, 2025 | Mixture-of-Experts, Reinforcement Learning | —Unverified | 0 |
| Seed1.5-VL Technical Report | May 11, 2025 | Mixture-of-Experts, Multimodal Reasoning | —Unverified | 0 |
| Seeing the Unseen: How EMoE Unveils Bias in Text-to-Image Diffusion Models | May 19, 2025 | Fairness, Mixture-of-Experts | —Unverified | 0 |
| SEER-MoE: Sparse Expert Efficiency through Regularization for Mixture-of-Experts | Apr 7, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Self-tuned Visual Subclass Learning with Shared Samples: An Incremental Approach | May 22, 2014 | Clustering, General Classification | —Unverified | 0 |
| Semantic-Aware Dynamic Parameter for Video Inpainting Transformer | Jan 1, 2023 | Mixture-of-Experts, Video Inpainting | —Unverified | 0 |
| Probing Semantic Routing in Large Mixture-of-Expert Models | Feb 15, 2025 | Mixture-of-Experts, Sentence | —Unverified | 0 |
| SemEval-2025 Task 1: AdMIRe -- Advancing Multimodal Idiomaticity Representation | Mar 19, 2025 | Mixture-of-Experts | —Unverified | 0 |
| MoESys: A Distributed and Efficient Mixture-of-Experts Training and Inference System for Internet Services | May 20, 2022 | CPU, Distributed Computing | —Unverified | 0 |
| Serving Large Language Models on Huawei CloudMatrix384 | Jun 15, 2025 | Mixture-of-Experts, Quantization | —Unverified | 0 |
| SwapMoE: Serving Off-the-shelf MoE-based Large Language Models with Tunable Memory Budget | Aug 29, 2023 | Mixture-of-Experts, Object Detection | —Unverified | 0 |
| Shortcut-connected Expert Parallelism for Accelerating Mixture-of-Experts | Apr 7, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Sigmoid Gating is More Sample Efficient than Softmax Gating in Mixture of Experts | May 22, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Sigmoid Self-Attention has Lower Sample Complexity than Softmax Self-Attention: A Mixture-of-Experts Perspective | Feb 1, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Simple or Complex? Complexity-Controllable Question Generation with Soft Templates and Deep Mixture of Experts Model | Oct 13, 2021 | Mixture-of-Experts, Question Generation | —Unverified | 0 |
| SimSMoE: Solving Representational Collapse via Similarity Measure | Jun 22, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Simultaneous Feature and Expert Selection within Mixture of Experts | May 29, 2014 | Feature Selection, Mixture-of-Experts | —Unverified | 0 |
| Single-Example Learning in a Mixture of GPDMs with Latent Geometries | Jun 17, 2025 | Mixture-of-Experts | —Unverified | 0 |
| SkillNet-X: A Multilingual Multitask Model with Sparsely Activated Skills | Jun 28, 2023 | Mixture-of-Experts, Natural Language Understanding | —Unverified | 0 |
| SMAR: Soft Modality-Aware Routing Strategy for MoE-based Multimodal Large Language Models Preserving Language Capabilities | Jun 6, 2025 | Mixture-of-Experts | —Unverified | 0 |