SOTAVerified

Mixture-of-Experts

Papers

Showing 591–600 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| Integrating Multi-view Analysis: Multi-view Mixture-of-Expert for Textual Personality Detection | Code | 0 |
| FactorLLM: Factorizing Knowledge via Mixture of Experts for Large Language Models | Code | 0 |
| BAM! Just Like That: Simple and Efficient Parameter Upcycling for Mixture of Experts | | 0 |
| A Survey on Model MoErging: Recycling and Routing Among Specialized Experts for Collaborative Learning | | 0 |
| AquilaMoE: Efficient Training for MoE Models with Scale-Up and Scale-Out Strategies | Code | 1 |
| Layerwise Recurrent Router for Mixture-of-Experts | Code | 1 |
| HoME: Hierarchy of Multi-Gate Experts for Multi-Task Learning at Kuaishou | | 0 |
| Understanding the Performance and Estimating the Cost of LLM Fine-Tuning | Code | 0 |
| LaDiMo: Layer-wise Distillation Inspired MoEfier | | 0 |
| MoC-System: Efficient Fault Tolerance for Sparse Mixture-of-Experts Model Training | | 0 |
Page 60 of 132

No leaderboard results yet.