| Title | Date | Task(s) | Code | Stars |
| --- | --- | --- | --- | --- |
| WDMoE: Wireless Distributed Large Language Models with Mixture of Experts | May 6, 2024 | Mixture-of-Experts | — | 0 |
| Lory: Fully Differentiable Mixture-of-Experts for Autoregressive Language Model Pre-training | May 6, 2024 | Language Modeling | — | 0 |
| Mixture of partially linear experts | May 5, 2024 | Mixture-of-Experts | — | 0 |
| MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts | May 2, 2024 | Combinatorial Optimization, Mixture-of-Experts | Code Available | 3 |
| Hierarchical mixture of discriminative Generalized Dirichlet classifiers | May 2, 2024 | Mixture-of-Experts, Spam Detection | — | 0 |
| Powering In-Database Dynamic Model Slicing for Structured Data Analytics | May 1, 2024 | Mixture-of-Experts | — | 0 |
| Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment | May 1, 2024 | Mixture-of-Experts | — | 0 |
| MoPEFT: A Mixture-of-PEFTs for the Segment Anything Model | May 1, 2024 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | — | 0 |
| Lancet: Accelerating Mixture-of-Experts Training via Whole Graph Computation-Communication Overlapping | Apr 30, 2024 | Mixture-of-Experts | — | 0 |
| Revisiting RGBT Tracking Benchmarks from the Perspective of Modality Validity: A New Benchmark, Problem, and Method | Apr 30, 2024 | Mixture-of-Experts, RGB-T Tracking | Code Available | 1 |