| Title | Date | Tasks | Code | Stars |
|---|---|---|---|---|
| EVLM: An Efficient Vision-Language Model for Visual Understanding | Jul 19, 2024 | Image Captioning, Language Modeling | Unverified | 0 |
| EvidenceMoE: A Physics-Guided Mixture-of-Experts with Evidential Critics for Advancing Fluorescence Light Detection and Ranging in Scattering Media | May 23, 2025 | Depth Estimation, Mixture-of-Experts | Unverified | 0 |
| Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM | Mar 22, 2025 | Code Generation, Mixture-of-Experts | Unverified | 0 |
| Every FLOP Counts: Scaling a 300B Mixture-of-Experts LING LLM without Premium GPUs | Mar 7, 2025 | Knowledge Graphs, Mixture-of-Experts | Unverified | 0 |
| Non-asymptotic model selection in block-diagonal mixture of polynomial experts models | Apr 18, 2021 | Mixture-of-Experts, Model Selection | Unverified | 0 |
| 3D Gaussian Splatting Data Compression with Mixture of Priors | May 6, 2025 | 3DGS, Data Compression | Unverified | 0 |
| Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models | Feb 18, 2025 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 |
| EVE: Efficient Vision-Language Pre-training with Masked Prediction and Modality-Aware MoE | Aug 23, 2023 | Image-Text Matching, Image-Text Retrieval | Unverified | 0 |
| Channel Gain Cartography via Mixture of Experts | Dec 8, 2020 | Mixture-of-Experts | Unverified | 0 |
| EVA: Mixture-of-Experts Semantic Variant Alignment for Compositional Zero-Shot Learning | Jun 26, 2025 | Compositional Zero-Shot Learning, Mixture-of-Experts | Unverified | 0 |