| Title | Date | Tags | Code | Implementations |
| --- | --- | --- | --- | --- |
| Scaling physics-informed hard constraints with mixture-of-experts | Feb 20, 2024 | Inductive Bias, Mixture-of-Experts | Code Available | 1 |
| HyperMoE: Towards Better Mixture of Experts via Transferring Among Experts | Feb 20, 2024 | Mixture-of-Experts, Multi-Task Learning | Code Available | 1 |
| BiMediX: Bilingual Medical Mixture of Experts LLM | Feb 20, 2024 | Mixture-of-Experts, Multiple-choice | Code Available | 1 |
| Denoising OCT Images Using Steered Mixture of Experts with Multi-Model Inference | Feb 20, 2024 | Denoising, Diagnostic | Unverified | 0 |
| MoELoRA: Contrastive Learning Guided Mixture of Experts on Parameter-Efficient Fine-Tuning for Large Language Models | Feb 20, 2024 | Common Sense Reasoning, Contrastive Learning | Unverified | 0 |
| Towards an Empirical Understanding of MoE Design Choices | Feb 20, 2024 | Mixture-of-Experts | Unverified | 0 |
| Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization | Feb 19, 2024 | Attribute, Counterfactual | Code Available | 1 |
| Turn Waste into Worth: Rectifying Top-k Router of MoE | Feb 17, 2024 | Computational Efficiency, GPU | Unverified | 0 |
| MoRAL: MoE Augmented LoRA for LLMs' Lifelong Learning | Feb 17, 2024 | Lifelong Learning, Mixture-of-Experts | Unverified | 0 |
| AMEND: A Mixture of Experts Framework for Long-tailed Trajectory Prediction | Feb 13, 2024 | Contrastive Learning, Mixture-of-Experts | Unverified | 0 |