SOTAVerified

Mixture-of-Experts

Papers

Showing 41–50 of 1312 papers

Title | Status | Hype
Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models | Code | 4
Learning Heterogeneous Mixture of Scene Experts for Large-scale Neural Radiance Fields | Code | 3
MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts | Code | 3
Generalizing Motion Planners with Mixture of Experts for Autonomous Driving | Code | 3
FlashDMoE: Fast Distributed MoE in a Single Kernel | Code | 3
Fiddler: CPU-GPU Orchestration for Fast Inference of Mixture-of-Experts Models | Code | 3
MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA-based Mixture of Experts | Code | 3
MoAI: Mixture of All Intelligence for Large Language and Vision Models | Code | 3
MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts | Code | 3
LLaVA-MoD: Making LLaVA Tiny via MoE Knowledge Distillation | Code | 3
Page 5 of 132

No leaderboard results yet.