| Simple or Complex? Complexity-Controllable Question Generation with Soft Templates and Deep Mixture of Experts Model | Oct 13, 2021 | Mixture-of-Experts, Question Generation | —Unverified | 0 | 0 |
| SimSMoE: Solving Representational Collapse via Similarity Measure | Jun 22, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Simultaneous Feature and Expert Selection within Mixture of Experts | May 29, 2014 | Feature Selection, Mixture-of-Experts | —Unverified | 0 | 0 |
| Single-Example Learning in a Mixture of GPDMs with Latent Geometries | Jun 17, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| SkillNet-X: A Multilingual Multitask Model with Sparsely Activated Skills | Jun 28, 2023 | Mixture-of-Experts, Natural Language Understanding | —Unverified | 0 | 0 |
| SMAR: Soft Modality-Aware Routing Strategy for MoE-based Multimodal Large Language Models Preserving Language Capabilities | Jun 6, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| SMILE: Scaling Mixture-of-Experts with Efficient Bi-level Routing | Dec 10, 2022 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Sparse Diffusion Policy: A Sparse, Reusable, and Flexible Policy for Robot Learning | Jul 1, 2024 | Continual Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Sparsely Activated Mixture-of-Experts are Robust Multi-Task Learners | Jan 16, 2022 | Mixture-of-Experts, Multi-Task Learning | —Unverified | 0 | 0 |
| Sparsely Activated Mixture-of-Experts are Robust Multi-Task Learners | Apr 16, 2022 | Mixture-of-Experts, Multi-Task Learning | —Unverified | 0 | 0 |