| Title | Date | Topics | Status |
| --- | --- | --- | --- |
| AMEND: A Mixture of Experts Framework for Long-tailed Trajectory Prediction | Feb 13, 2024 | Contrastive Learning, Mixture-of-Experts | Unverified |
| Adaptive Detection of Fast Moving Celestial Objects Using a Mixture of Experts and Physical-Inspired Neural Network | Apr 10, 2025 | Mixture-of-Experts, Object Detection | Unverified |
| How to Upscale Neural Networks with Scaling Law? A Survey and Practical Guidelines | Feb 17, 2025 | Mixture-of-Experts | Unverified |
| How Lightweight Can A Vision Transformer Be | Jul 25, 2024 | Mixture-of-Experts, Transfer Learning | Unverified |
| How does Architecture Influence the Base Capabilities of Pre-trained Language Models? A Case Study Based on FFN-Wider and MoE Transformers | Mar 4, 2024 | Few-Shot Learning, Language Modeling | Unverified |
| A Unified Virtual Mixture-of-Experts Framework: Enhanced Inference and Hallucination Mitigation in Single-Model System | Apr 1, 2025 | Dialogue Generation, Ensemble Learning | Unverified |
| How Do Consumers Really Choose: Exposing Hidden Preferences with the Mixture of Experts Model | Mar 3, 2025 | Decision Making, Demand Forecasting | Unverified |
| How Can Cross-lingual Knowledge Contribute Better to Fine-Grained Entity Typing? | May 1, 2022 | Entity Typing, Mixture-of-Experts | Unverified |
| HOMOE: A Memory-Based and Composition-Aware Framework for Zero-Shot Learning with Hopfield Network and Soft Mixture of Experts | Nov 23, 2023 | Compositional Zero-Shot Learning, Mixture-of-Experts | Unverified |
| HoME: Hierarchy of Multi-Gate Experts for Multi-Task Learning at Kuaishou | Aug 10, 2024 | Mixture-of-Experts, Multi-Task Learning | Unverified |
| A Unified Framework for Iris Anti-Spoofing: Introducing IrisGeneral Dataset and Masked-MoE Method | Aug 19, 2024 | Iris Recognition, Mixture-of-Experts | Unverified |
| Holistic Capability Preservation: Towards Compact Yet Comprehensive Reasoning Models | Apr 9, 2025 | Instruction Following, Mathematical Problem-Solving | Unverified |
| HOBBIT: A Mixed Precision Expert Offloading System for Fast MoE Inference | Nov 3, 2024 | Mixture-of-Experts | Unverified |
| HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization | Nov 15, 2022 | Domain Generalization, Mixture-of-Experts | Unverified |
| HMoE: Heterogeneous Mixture of Experts for Language Modeling | Aug 20, 2024 | Computational Efficiency, Language Modeling | Unverified |
| A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds | Nov 25, 2013 | Learning Theory, Mixture-of-Experts | Unverified |
| HiMoE: Heterogeneity-Informed Mixture-of-Experts for Fair Spatial-Temporal Forecasting | Nov 30, 2024 | Fairness, Mixture-of-Experts | Unverified |
| Hierarchical Routing Mixture of Experts | Mar 18, 2019 | Mixture-of-Experts, Regression | Unverified |
| Deep Learning Mixture-of-Experts Approach for Cytotoxic Edema Assessment in Infants and Children | Oct 6, 2022 | Image Classification | Unverified |
| A Two-Phase Deep Learning Framework for Adaptive Time-Stepping in High-Speed Flow Modeling | Jun 9, 2025 | Mixture-of-Experts | Unverified |
| Alternating Updates for Efficient Transformers | Jan 30, 2023 | Mixture-of-Experts | Unverified |
| Adaptive Conditional Expert Selection Network for Multi-domain Recommendation | Nov 11, 2024 | Computational Efficiency, Mixture-of-Experts | Unverified |
| Accelerating Mixture-of-Experts Training with Adaptive Expert Replication | Apr 28, 2025 | GPU, Mixture-of-Experts | Unverified |
| Hierarchical Mixture-of-Experts Model for Large-Scale Gaussian Process Regression | Dec 9, 2014 | Mixture-of-Experts, Regression | Unverified |
| Deep Gaussian Covariance Network | Oct 17, 2017 | Gaussian Processes, Mixture-of-Experts | Unverified |