SOTAVerified

Mixture-of-Experts

Papers

Showing 476–500 of 1312 papers

Title | Status | Hype
Finding Fantastic Experts in MoEs: A Unified Study for Expert Dropping Strategies and Observations | - | 0
HeterMoE: Efficient Training of Mixture-of-Experts Models on Heterogeneous GPUs | - | 0
RingMoE: Mixture-of-Modality-Experts Multi-Modal Foundation Models for Universal Remote Sensing Image Interpretation | - | 0
MegaScale-Infer: Serving Mixture-of-Experts at Scale with Disaggregated Expert Parallelism | - | 0
Advancing MoE Efficiency: A Collaboration-Constrained Routing (C2R) Strategy for Better Expert Parallelism Design | - | 0
A Unified Virtual Mixture-of-Experts Framework: Enhanced Inference and Hallucination Mitigation in Single-Model System | - | 0
DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism | Code | 0
Detecting Financial Fraud with Hybrid Deep Learning: A Mix-of-Experts Approach to Sequential and Anomalous Patterns | - | 0
Unimodal-driven Distillation in Multimodal Emotion Recognition with Dynamic Fusion | - | 0
Mixture of Routers | - | 0
S2MoE: Robust Sparse Mixture of Experts via Stochastic Learning | - | 0
Sparse Mixture of Experts as Unified Competitive Learning | - | 0
Beyond Standard MoE: Mixture of Latent Experts for Resource-Efficient Language Models | - | 0
Exploiting Mixture-of-Experts Redundancy Unlocks Multimodal Generative Abilities | - | 0
RocketPPA: Code-Level Power, Performance, and Area Prediction via LLM and Mixture of Experts | - | 0
LLaVA-CMoE: Towards Continual Mixture of Experts for Large Vision-Language Models | - | 0
iMedImage Technical Report | - | 0
Reasoning Beyond Limits: Advances and Open Problems for LLMs | - | 0
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation | - | 0
Modality-Independent Brain Lesion Segmentation with Privacy-aware Continual Learning | Code | 0
Optimal Scaling Laws for Efficiency Gains in a Theoretical Transformer-Augmented Sectional MoE Framework | - | 0
Enhancing Multi-modal Models with Heterogeneous MoE Adapters for Fine-tuning | - | 0
A multi-scale lithium-ion battery capacity prediction using mixture of experts and patch-based MLP | Code | 0
M^2CD: A Unified MultiModal Framework for Optical-SAR Change Detection with Mixture of Experts and Self-Distillation | - | 0
Resilient Sensor Fusion under Adverse Sensor Failures via Multi-Modal Expert Fusion | - | 0
Page 20 of 53

No leaderboard results yet.