| HDformer: A Higher Dimensional Transformer for Diabetes Detection Utilizing Long Range Vascular Signals | Mar 17, 2023 | Computational Efficiency, Mixture-of-Experts | — Unverified | 0 | 0 |
| HeterMoE: Efficient Training of Mixture-of-Experts Models on Heterogeneous GPUs | Apr 4, 2025 | GPU, Mixture-of-Experts | — Unverified | 0 | 0 |
| Heuristic-Informed Mixture of Experts for Link Prediction in Multilayer Networks | Jan 29, 2025 | Link Prediction, Mixture-of-Experts | — Unverified | 0 | 0 |
| Hierarchical mixture of discriminative Generalized Dirichlet classifiers | May 2, 2024 | Mixture-of-Experts, Spam detection | — Unverified | 0 | 0 |
| Hierarchical Mixture-of-Experts Model for Large-Scale Gaussian Process Regression | Dec 9, 2014 | Mixture-of-Experts, regression | — Unverified | 0 | 0 |
| Hierarchical Routing Mixture of Experts | Mar 18, 2019 | Mixture-of-Experts, regression | — Unverified | 0 | 0 |
| HiMoE: Heterogeneity-Informed Mixture-of-Experts for Fair Spatial-Temporal Forecasting | Nov 30, 2024 | Fairness, Mixture-of-Experts | — Unverified | 0 | 0 |
| HMoE: Heterogeneous Mixture of Experts for Language Modeling | Aug 20, 2024 | Computational Efficiency, Language Modeling | — Unverified | 0 | 0 |
| HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization | Nov 15, 2022 | Domain Generalization, Mixture-of-Experts | — Unverified | 0 | 0 |
| HOBBIT: A Mixed Precision Expert Offloading System for Fast MoE Inference | Nov 3, 2024 | Mixture-of-Experts | — Unverified | 0 | 0 |