| Towards Vision Mixture of Experts for Wildlife Monitoring on the Edge | Nov 12, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Training-efficient density quantum machine learning | May 30, 2024 | LEMMA, Mixture-of-Experts | —Unverified | 0 | 0 |
| Training of Neural Networks with Uncertain Data: A Mixture of Experts Approach | Dec 13, 2023 | Autonomous Driving, Mixture-of-Experts | —Unverified | 0 | 0 |
| TrajMoE: Spatially-Aware Mixture of Experts for Unified Human Mobility Modeling | May 24, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Transformer Layer Injection: A Novel Approach for Efficient Upscaling of Large Language Models | Oct 15, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Tree-gated Deep Mixture-of-Experts For Pose-robust Face Alignment | Oct 21, 2019 | Face Alignment, Mixture-of-Experts | —Unverified | 0 | 0 |
| Trend Filtered Mixture of Experts for Automated Gating of High-Frequency Flow Cytometry Data | Apr 16, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Towards Incremental Learning in Large Language Models: A Critical Review | Apr 28, 2024 | Continual Learning, Incremental Learning | —Unverified | 0 | 0 |
| True Zero-Shot Inference of Dynamical Systems Preserving Long-Term Statistics | May 19, 2025 | Mixture-of-Experts, Time Series | —Unverified | 0 | 0 |
| TS-RAG: Retrieval-Augmented Generation based Time Series Foundation Models are Stronger Zero-Shot Forecaster | Mar 6, 2025 | Domain Adaptation, Mixture-of-Experts | —Unverified | 0 | 0 |
| Tuning of Mixture-of-Experts Mixed-Precision Neural Networks | Sep 29, 2022 | Image Classification | —Unverified | 0 | 0 |
| Turn Waste into Worth: Rectifying Top-k Router of MoE | Feb 17, 2024 | Computational Efficiency, GPU | —Unverified | 0 | 0 |
| Two Experts Are All You Need for Steering Thinking: Reinforcing Cognitive Effort in MoE Reasoning Models Without Additional Training | May 20, 2025 | All, Domain Generalization | —Unverified | 0 | 0 |
| Two Is Better Than One: Rotations Scale LoRAs | May 29, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | Apr 25, 2024 | Automatic Speech Recognition (ASR) | —Unverified | 0 | 0 |
| UGG-ReID: Uncertainty-Guided Graph Model for Multi-Modal Object Re-Identification | Jul 7, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Fast Deep Mixtures of Gaussian Process Experts | Jun 11, 2020 | Gaussian Processes, Mixture-of-Experts | —Unverified | 0 | 0 |
| Ultra-Sparse Memory Network | Nov 19, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| UME: Upcycling Mixture-of-Experts for Scalable and Efficient Automatic Speech Recognition | Dec 23, 2024 | Automatic Speech Recognition (ASR) | —Unverified | 0 | 0 |
| UMoE: Unifying Attention and FFN with Shared Experts | May 12, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Unbiased Gradient Estimation with Balanced Assignments for Mixtures of Experts | Sep 24, 2021 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Uncertainty-Aware Driver Trajectory Prediction at Urban Intersections | Jan 16, 2019 | Mixture-of-Experts, Prediction | —Unverified | 0 | 0 |
| Uncertainty-Encoded Multi-Modal Fusion for Robust Object Detection in Autonomous Driving | Jul 30, 2023 | Autonomous Driving, Mixture-of-Experts | —Unverified | 0 | 0 |
| Understanding Expert Structures on Minimax Parameter Estimation in Contaminated Mixture of Experts | Oct 16, 2024 | Mixture-of-Experts, Parameter Estimation | —Unverified | 0 | 0 |
| UniAdapt: A Universal Adapter for Knowledge Calibration | Oct 1, 2024 | Mixture-of-Experts, Model Editing | —Unverified | 0 | 0 |
| UNIALIGN: Scaling Multimodal Alignment within One Unified Model | Jan 1, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| UniCodec: Unified Audio Codec with Single Domain-Adaptive Codebook | Feb 27, 2025 | Language Modelling | —Unverified | 0 | 0 |
| UniCoRN: Latent Diffusion-based Unified Controllable Image Restoration Network across Multiple Degradations | Mar 20, 2025 | Image Restoration, Mixture-of-Experts | —Unverified | 0 | 0 |
| Unified Modeling of Multi-Domain Multi-Device ASR Systems | May 13, 2022 | Automatic Speech Recognition (ASR) | —Unverified | 0 | 0 |
| Unify and Anchor: A Context-Aware Transformer for Cross-Domain Time Series Forecasting | Mar 3, 2025 | Domain Generalization, Mixture-of-Experts | —Unverified | 0 | 0 |
| Unimodal-driven Distillation in Multimodal Emotion Recognition with Dynamic Fusion | Mar 31, 2025 | Emotion Recognition, Knowledge Distillation | —Unverified | 0 | 0 |
| UniPaint: Unified Space-time Video Inpainting via Mixture-of-Experts | Dec 9, 2024 | Mixture-of-Experts, Video Inpainting | —Unverified | 0 | 0 |
| UniF^2ace: Fine-grained Face Understanding and Generation with Unified Multimodal Models | Mar 11, 2025 | Attribute, Mixture-of-Experts | —Unverified | 0 | 0 |
| UniUIR: Considering Underwater Image Restoration as An All-in-One Learner | Jan 22, 2025 | All, Depth Estimation | —Unverified | 0 | 0 |
| Unraveling the Localized Latents: Learning Stratified Manifold Structures in LLM Embedding Space with Sparse Mixture-of-Experts | Feb 19, 2025 | Dictionary Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Unveiling Hidden Collaboration within Mixture-of-Experts in Large Language Models | Apr 16, 2025 | Dictionary Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| UOE: Unlearning One Expert Is Enough For Mixture-of-experts LLMS | Nov 27, 2024 | Large Language Model, Mixture-of-Experts | —Unverified | 0 | 0 |
| Upcycling Instruction Tuning from Dense to Mixture-of-Experts via Parameter Merging | Oct 2, 2024 | Diversity, Mixture-of-Experts | —Unverified | 0 | 0 |
| Upcycling Large Language Models into Mixture of Experts | Oct 10, 2024 | Mixture-of-Experts, MMLU | —Unverified | 0 | 0 |
| Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC | Nov 7, 2022 | Data Augmentation, Mixture-of-Experts | —Unverified | 0 | 0 |
| Utility-Driven Speculative Decoding for Mixture-of-Experts | Jun 17, 2025 | GPU, Large Language Model | —Unverified | 0 | 0 |
| Vanilla Transformers are Transfer Capability Teachers | Mar 4, 2024 | Computational Efficiency, Mixture-of-Experts | —Unverified | 0 | 0 |
| Variational Distillation of Diffusion Policies into Mixture of Experts | Jun 18, 2024 | Denoising, Mixture-of-Experts | —Unverified | 0 | 0 |
| Variational Mixture of Gaussian Process Experts | Dec 1, 2008 | Gaussian Processes, Mixture-of-Experts | —Unverified | 0 | 0 |
| ViMoE: An Empirical Study of Designing Vision Mixture-of-Experts | Oct 21, 2024 | Image Classification | —Unverified | 0 | 0 |
| Visual Saliency Prediction Using a Mixture of Deep Neural Networks | Feb 1, 2017 | Mixture-of-Experts, Saliency Prediction | —Unverified | 0 | 0 |
| WDMoE: Wireless Distributed Large Language Models with Mixture of Experts | May 6, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| WDMoE: Wireless Distributed Mixture of Experts for Large Language Models | Nov 11, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| WeNet: Weighted Networks for Recurrent Network Architecture Search | Apr 8, 2019 | General Classification, Image Classification | —Unverified | 0 | 0 |
| Who Says Elephants Can't Run: Bringing Large Scale MoE Models into Cloud Scale Production | Nov 18, 2022 | Machine Translation, Mixture-of-Experts | —Unverified | 0 | 0 |