| Enhancing Code-Switching Speech Recognition with LID-Based Collaborative Mixture of Experts Model | Sep 3, 2024 | Language Identification, Mixture-of-Experts | —Unverified | 0 |
| Enhancing Generalization in Sparse Mixture of Experts Models: The Case for Increased Expert Activation in Compositional Tasks | Oct 17, 2024 | Mixture-of-Experts | —Unverified | 0 |
| Enhancing Healthcare Recommendation Systems with a Multimodal LLMs-based MOE Architecture | Dec 16, 2024 | Mixture-of-Experts, Recommendation Systems | —Unverified | 0 |
| Enhancing Multimodal Continual Instruction Tuning with BranchLoRA | May 31, 2025 | Mixture-of-Experts | —Unverified | 0 |
| Enhancing Multi-modal Models with Heterogeneous MoE Adapters for Fine-tuning | Mar 26, 2025 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | —Unverified | 0 |
| Enhancing the "Immunity" of Mixture-of-Experts Networks for Adversarial Defense | Feb 29, 2024 | Adversarial Defense, Adversarial Robustness | —Unverified | 0 |
| Ensemble Learning for Large Language Models in Text and Code Generation: A Survey | Mar 13, 2025 | Code Generation, Ensemble Learning | —Unverified | 0 |
| EPS-MoE: Expert Pipeline Scheduler for Cost-Efficient MoE Inference | Oct 16, 2024 | Computational Efficiency, Large Language Model | —Unverified | 0 |
| Evaluating Expert Contributions in a MoE LLM for Quiz-Based Tasks | Feb 24, 2025 | Mixture-of-Experts, MMLU | —Unverified | 0 |
| EVA: Mixture-of-Experts Semantic Variant Alignment for Compositional Zero-Shot Learning | Jun 26, 2025 | Compositional Zero-Shot Learning, Mixture-of-Experts | —Unverified | 0 |