| Trend Filtered Mixture of Experts for Automated Gating of High-Frequency Flow Cytometry Data | Apr 16, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Towards Incremental Learning in Large Language Models: A Critical Review | Apr 28, 2024 | Continual Learning, Incremental Learning | —Unverified | 0 |
| True Zero-Shot Inference of Dynamical Systems Preserving Long-Term Statistics | May 19, 2025 | Mixture-of-Experts, Time Series | —Unverified | 0 |
| TS-RAG: Retrieval-Augmented Generation based Time Series Foundation Models are Stronger Zero-Shot Forecaster | Mar 6, 2025 | Domain Adaptation, Mixture-of-Experts | —Unverified | 0 |
| Tuning of Mixture-of-Experts Mixed-Precision Neural Networks | Sep 29, 2022 | Image Classification | —Unverified | 0 |
| Turn Waste into Worth: Rectifying Top-k Router of MoE | Feb 17, 2024 | Computational Efficiency, GPU | —Unverified | 0 |
| Two Experts Are All You Need for Steering Thinking: Reinforcing Cognitive Effort in MoE Reasoning Models Without Additional Training | May 20, 2025 | Domain Generalization | —Unverified | 0 |
| Two Is Better Than One: Rotations Scale LoRAs | May 29, 2025 | Mixture-of-Experts | —Unverified | 0 |
| U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | Apr 25, 2024 | Automatic Speech Recognition (ASR) | —Unverified | 0 |
| UGG-ReID: Uncertainty-Guided Graph Model for Multi-Modal Object Re-Identification | Jul 7, 2025 | Mixture-of-Experts | —Unverified | 0 |