| Tuning of Mixture-of-Experts Mixed-Precision Neural Networks | Sep 29, 2022 | Image Classification | Unverified | 0 | 0 |
| Turn Waste into Worth: Rectifying Top-k Router of MoE | Feb 17, 2024 | Computational Efficiency, GPU | Unverified | 0 | 0 |
| Two Experts Are All You Need for Steering Thinking: Reinforcing Cognitive Effort in MoE Reasoning Models Without Additional Training | May 20, 2025 | Domain Generalization | Unverified | 0 | 0 |
| Two Is Better Than One: Rotations Scale LoRAs | May 29, 2025 | Mixture-of-Experts | Unverified | 0 | 0 |
| U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | Apr 25, 2024 | Automatic Speech Recognition (ASR) | Unverified | 0 | 0 |
| UGG-ReID: Uncertainty-Guided Graph Model for Multi-Modal Object Re-Identification | Jul 7, 2025 | Mixture-of-Experts | Unverified | 0 | 0 |
| Fast Deep Mixtures of Gaussian Process Experts | Jun 11, 2020 | Gaussian Processes, Mixture-of-Experts | Unverified | 0 | 0 |
| Ultra-Sparse Memory Network | Nov 19, 2024 | Mixture-of-Experts | Unverified | 0 | 0 |
| UME: Upcycling Mixture-of-Experts for Scalable and Efficient Automatic Speech Recognition | Dec 23, 2024 | Automatic Speech Recognition (ASR) | Unverified | 0 | 0 |
| UMoE: Unifying Attention and FFN with Shared Experts | May 12, 2025 | Mixture-of-Experts | Unverified | 0 | 0 |