| Efficient Residual Learning with Mixture-of-Experts for Universal Dexterous Grasping | Oct 3, 2024 | GPU, Mixture-of-Experts | —Unverified | 0 | 0 |
| Efficient Training of Large-Scale AI Models Through Federated Mixture-of-Experts: A System-Level Approach | Jul 8, 2025 | Edge-computing, Federated Learning | —Unverified | 0 | 0 |
| eMoE: Task-aware Memory Efficient Mixture-of-Experts-Based (MoE) Model Inference | Mar 10, 2025 | Mixture-of-Experts, Scheduling | —Unverified | 0 | 0 |
| ENACT-Heart -- ENsemble-based Assessment Using CNN and Transformer on Heart Sounds | Feb 24, 2025 | Diagnostic, Mixture-of-Experts | —Unverified | 0 | 0 |
| Enhancing Code-Switching ASR Leveraging Non-Peaky CTC Loss and Deep Language Posterior Injection | Nov 26, 2024 | Automatic Speech Recognition (ASR) | —Unverified | 0 | 0 |
| Enhancing Code-Switching Speech Recognition with LID-Based Collaborative Mixture of Experts Model | Sep 3, 2024 | Language Identification, Mixture-of-Experts | —Unverified | 0 | 0 |
| Enhancing Generalization in Sparse Mixture of Experts Models: The Case for Increased Expert Activation in Compositional Tasks | Oct 17, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Enhancing Healthcare Recommendation Systems with a Multimodal LLMs-based MOE Architecture | Dec 16, 2024 | Mixture-of-Experts, Recommendation Systems | —Unverified | 0 | 0 |
| Enhancing Multimodal Continual Instruction Tuning with BranchLoRA | May 31, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Enhancing Multi-modal Models with Heterogeneous MoE Adapters for Fine-tuning | Mar 26, 2025 | Mixture-of-Experts, parameter-efficient fine-tuning | —Unverified | 0 | 0 |