SOTAVerified

Mixture-of-Experts

Papers

Showing 26–50 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training | Code | 5 |
| DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | Code | 5 |
| Chatlaw: A Multi-Agent Collaborative Legal Assistant with Knowledge Graph Enhanced Mixture-of-Experts Large Language Model | Code | 5 |
| OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models | Code | 5 |
| DeepSpeed Inference: Enabling Efficient Inference of Transformer Models at Unprecedented Scale | Code | 4 |
| JetMoE: Reaching Llama2 Performance with 0.1M Dollars | Code | 4 |
| OLMoE: Open Mixture-of-Experts Language Models | Code | 4 |
| MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | Code | 4 |
| Training Sparse Mixture Of Experts Text Embedding Models | Code | 4 |
| Gated Attention for Large Language Models: Non-linearity, Sparsity, and Attention-Sink-Free | Code | 4 |
| Mixtral of Experts | Code | 4 |
| Fast Inference of Mixture-of-Experts Language Models with Offloading | Code | 4 |
| Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models | Code | 4 |
| Let the Expert Stick to His Last: Expert-Specialized Fine-Tuning for Sparse Architectural Large Language Models | Code | 4 |
| MoH: Multi-Head Attention as Mixture-of-Head Attention | Code | 4 |
| Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts | Code | 4 |
| BlackMamba: Mixture of Experts for State-Space Models | Code | 3 |
| Learning Heterogeneous Mixture of Scene Experts for Large-scale Neural Radiance Fields | Code | 3 |
| Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters | Code | 3 |
| MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts | Code | 3 |
| A Survey on Mixture of Experts | Code | 3 |
| A Survey on Inference Optimization Techniques for Mixture of Experts Models | Code | 3 |
| AnyGraph: Graph Foundation Model in the Wild | Code | 3 |
| Generalizing Motion Planners with Mixture of Experts for Autonomous Driving | Code | 3 |
| MoAI: Mixture of All Intelligence for Large Language and Vision Models | Code | 3 |
Page 2 of 53

No leaderboard results yet.