SOTAVerified

Mixture-of-Experts

Papers

Showing 191–200 of 1,312 papers

Title | Status | Hype
M3-Jepa: Multimodal Alignment via Multi-directional MoE based on the JEPA framework | Code | 1
Gradient-free variational learning with conditional mixture networks | Code | 1
Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting | Code | 1
AdapMoE: Adaptive Sensitivity-based Expert Gating and Management for Efficient MoE Inference | Code | 1
Customizing Language Models with Instance-wise LoRA for Sequential Recommendation | Code | 1
Layerwise Recurrent Router for Mixture-of-Experts | Code | 1
AquilaMoE: Efficient Training for MoE Models with Scale-Up and Scale-Out Strategies | Code | 1
MoExtend: Tuning New Experts for Modality and Task Extension | Code | 1
Dynamic Language Group-Based MoE: Enhancing Code-Switching Speech Recognition with Hierarchical Routing | Code | 1
M4: Multi-Proxy Multi-Gate Mixture of Experts Network for Multiple Instance Learning in Histopathology Image Analysis | Code | 1
