SOTAVerified

Mixture-of-Experts

Papers

Showing 291–300 of 1,312 papers

Title | Status | Hype
Awaker2.5-VL: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts | Code | 1
Emotion-Qwen: Training Hybrid Experts for Unified Emotion and General Vision-Language Understanding | Code | 1
MoGERNN: An Inductive Traffic Predictor for Unobserved Locations in Dynamic Sensing Networks | Code | 1
Dynamic Data Mixing Maximizes Instruction Tuning for Mixture-of-Experts | Code | 1
Dynamic Language Group-Based MoE: Enhancing Code-Switching Speech Recognition with Hierarchical Routing | Code | 1
MoËT: Mixture of Expert Trees and its Application to Verifiable Reinforcement Learning | Code | 1
DirectMultiStep: Direct Route Generation for Multi-Step Retrosynthesis | Code | 1
Enhancing Fast Feed Forward Networks with Load Balancing and a Master Leaf Node | Code | 1
MoExtend: Tuning New Experts for Modality and Task Extension | Code | 1
Efficient Dictionary Learning with Switch Sparse Autoencoders | Code | 1
Page 30 of 132

No leaderboard results yet.