| Beyond the Typical: Modeling Rare Plausible Patterns in Chemical Reactions by Leveraging Sequential Mixture-of-Experts | Oct 7, 2023 | Mixture-of-Experts | Unverified | 0 | 0 |
| Modeling Task Relationships in Multi-variate Soft Sensor with Balanced Mixture-of-Experts | May 25, 2023 | Mixture-of-Experts | Unverified | 0 | 0 |
| Model Merging in Pre-training of Large Language Models | May 17, 2025 | Mixture-of-Experts | Unverified | 0 | 0 |
| Model Selection for Gaussian-gated Gaussian Mixture of Experts Using Dendrograms of Mixing Measures | May 19, 2025 | Computational Efficiency, Ensemble Learning | Unverified | 0 | 0 |
| Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners | Dec 15, 2022 | Mixture-of-Experts, Multi-Task Learning | Unverified | 0 | 0 |
| Mod-Squad: Designing Mixtures of Experts As Modular Multi-Task Learners | Jan 1, 2023 | Mixture-of-Experts, Multi-Task Learning | Unverified | 0 | 0 |
| Modularity Matters: Learning Invariant Relational Reasoning Tasks | Jun 18, 2018 | Mixture-of-Experts, Relational Reasoning | Unverified | 0 | 0 |
| MoE-AMC: Enhancing Automatic Modulation Classification Performance Using Mixture-of-Experts | Dec 4, 2023 | Classification, Mixture-of-Experts | Unverified | 0 | 0 |
| MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation | Jan 16, 2022 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 | 0 |
| MoE-CAP: Benchmarking Cost, Accuracy and Performance of Sparse Mixture-of-Experts Systems | Dec 10, 2024 | Benchmarking, Mixture-of-Experts | Unverified | 0 | 0 |