| Title | Date | Tasks | Code | Stars |
| --- | --- | --- | --- | --- |
| Approximation Rates and VC-Dimension Bounds for (P)ReLU MLP Mixture of Experts | Feb 5, 2024 | GPU, Mixture-of-Experts | Unverified | 0 |
| On Least Square Estimation in Softmax Gating Mixture of Experts | Feb 5, 2024 | Mixture-of-Experts | Unverified | 0 |
| Intrinsic User-Centric Interpretability through Global Mixture of Experts | Feb 5, 2024 | Mixture-of-Experts, News Classification | Code Available | 0 |
| FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion | Feb 5, 2024 | Missing Elements, Mixture-of-Experts | Unverified | 0 |
| CompeteSMoE - Effective Training of Sparse Mixture of Experts via Competition | Feb 4, 2024 | Mixture-of-Experts | Code Available | 0 |
| pFedMoE: Data-Level Personalization with Mixture of Experts for Model-Heterogeneous Personalized Federated Learning | Feb 2, 2024 | Federated Learning, Mixture-of-Experts | Code Available | 0 |
| BlackMamba: Mixture of Experts for State-Space Models | Feb 1, 2024 | Language Modeling | Code Available | 3 |
| Efficient Fine-tuning of Audio Spectrogram Transformers via Soft Mixture of Adapters | Feb 1, 2024 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | Code Available | 1 |
| Merging Multi-Task Models via Weight-Ensembling Mixture of Experts | Feb 1, 2024 | Mixture-of-Experts, Task Arithmetic | Code Available | 1 |
| MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts | Jan 31, 2024 | Mixture-of-Experts | Unverified | 0 |