| Title | Date | Tags | Code | ★ |
| --- | --- | --- | --- | --- |
| Co-Supervised Learning: Improving Weak-to-Strong Generalization with Hierarchical Mixture of Experts | Feb 23, 2024 | Mixture-of-Experts | Code Available | 0 |
| PEMT: Multi-Task Correlation Guided Mixture-of-Experts Enables Parameter-Efficient Transfer Learning | Feb 23, 2024 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | Unverified | 0 |
| MoELoRA: Contrastive Learning Guided Mixture of Experts on Parameter-Efficient Fine-Tuning for Large Language Models | Feb 20, 2024 | Common Sense Reasoning, Contrastive Learning | Unverified | 0 |
| Denoising OCT Images Using Steered Mixture of Experts with Multi-Model Inference | Feb 20, 2024 | Denoising, Diagnostic | Unverified | 0 |
| Towards an empirical understanding of MoE design choices | Feb 20, 2024 | Mixture-of-Experts | Unverified | 0 |
| Turn Waste into Worth: Rectifying Top-k Router of MoE | Feb 17, 2024 | Computational Efficiency, GPU | Unverified | 0 |
| MoRAL: MoE Augmented LoRA for LLMs' Lifelong Learning | Feb 17, 2024 | Lifelong Learning, Mixture-of-Experts | Unverified | 0 |
| Mixture of Link Predictors on Graphs | Feb 13, 2024 | Link Prediction, Mixture-of-Experts | Code Available | 0 |
| AMEND: A Mixture of Experts Framework for Long-tailed Trajectory Prediction | Feb 13, 2024 | Contrastive Learning, Mixture-of-Experts | Unverified | 0 |
| P-Mamba: Marrying Perona Malik Diffusion with Mamba for Efficient Pediatric Echocardiographic Left Ventricular Segmentation | Feb 13, 2024 | Mamba, Mixture-of-Experts | Unverified | 0 |
| Differentially Private Training of Mixture of Experts Models | Feb 11, 2024 | Computational Efficiency, Mixture-of-Experts | Unverified | 0 |
| Buffer Overflow in Mixture of Experts | Feb 8, 2024 | Mixture-of-Experts | Unverified | 0 |
| Task-customized Masked AutoEncoder via Mixture of Cluster-conditional Experts | Feb 8, 2024 | Mixture-of-Experts, Self-Supervised Learning | Unverified | 0 |
| On Parameter Estimation in Deviated Gaussian Mixture of Experts | Feb 7, 2024 | Mixture-of-Experts, Parameter Estimation | Unverified | 0 |
| Intrinsic User-Centric Interpretability through Global Mixture of Experts | Feb 5, 2024 | Mixture-of-Experts, News Classification | Code Available | 0 |
| Approximation Rates and VC-Dimension Bounds for (P)ReLU MLP Mixture of Experts | Feb 5, 2024 | GPU, Mixture-of-Experts | Unverified | 0 |
| On Least Square Estimation in Softmax Gating Mixture of Experts | Feb 5, 2024 | Mixture-of-Experts | Unverified | 0 |
| FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion | Feb 5, 2024 | Missing Elements, Mixture-of-Experts | Unverified | 0 |
| CompeteSMoE - Effective Training of Sparse Mixture of Experts via Competition | Feb 4, 2024 | Mixture-of-Experts | Code Available | 0 |
| pFedMoE: Data-Level Personalization with Mixture of Experts for Model-Heterogeneous Personalized Federated Learning | Feb 2, 2024 | Federated Learning, Mixture-of-Experts | Code Available | 0 |
| MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts | Jan 31, 2024 | Mixture-of-Experts | Unverified | 0 |
| Explainable data-driven modeling via mixture of experts: towards effective blending of grey and black-box models | Jan 30, 2024 | Mixture-of-Experts | Unverified | 0 |
| Checkmating One, by Using Many: Combining Mixture of Experts with MCTS to Improve in Chess | Jan 30, 2024 | Mixture-of-Experts | Code Available | 0 |
| LLaVA-MoLE: Sparse Mixture of LoRA Experts for Mitigating Data Conflicts in Instruction Finetuning MLLMs | Jan 29, 2024 | Language Modelling, Large Language Model | Unverified | 0 |
| Routers in Vision Mixture of Experts: An Empirical Study | Jan 29, 2024 | Language Modelling | Unverified | 0 |
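Several entries above (e.g. "Turn Waste into Worth: Rectifying Top-k Router of MoE", "CompeteSMoE", "Routers in Vision Mixture of Experts") study sparse MoE layers built around a learned top-k router. A minimal sketch of that shared mechanism, using NumPy with illustrative shapes and expert counts (none taken from any specific paper):

```python
# Hypothetical top-k MoE forward pass: each token is routed to its k
# highest-scoring experts, and their outputs are mixed by softmax weights.
import numpy as np

rng = np.random.default_rng(0)

def top_k_moe(x, w_gate, experts, k=2):
    """x: (tokens, d); w_gate: (d, n_experts); experts: list of (d, d) matrices."""
    logits = x @ w_gate                               # router scores (tokens, n_experts)
    topk = np.argsort(logits, axis=-1)[:, -k:]        # indices of the k best experts
    sel = np.take_along_axis(logits, topk, axis=-1)   # their logits only
    gate = np.exp(sel - sel.max(-1, keepdims=True))   # softmax over the k selected
    gate /= gate.sum(-1, keepdims=True)               # weights sum to 1 per token

    out = np.zeros_like(x)
    for t in range(x.shape[0]):                       # dense loops for clarity only
        for j in range(k):
            e = topk[t, j]
            out[t] += gate[t, j] * (x[t] @ experts[e])
    return out

d, n_exp = 8, 4
x = rng.standard_normal((5, d))
y = top_k_moe(x, rng.standard_normal((d, n_exp)),
              [rng.standard_normal((d, d)) for _ in range(n_exp)])
print(y.shape)  # (5, 8): output keeps the input shape
```

Real implementations replace the per-token loop with batched dispatch and add a load-balancing loss; the papers listed here largely differ in how this router is trained or corrected.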