
Mixture-of-Experts

Papers

Showing 581–590 of 1312 papers

Title | Status | Hype
--- | --- | ---
MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing | Code | 0
FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts | | 0
KAN4TSF: Are KAN and KAN-based models Effective for Time Series Forecasting? | Code | 2
Navigating Spatio-Temporal Heterogeneity: A Graph Transformer Approach for Traffic Forecasting | Code | 1
HMoE: Heterogeneous Mixture of Experts for Language Modeling | | 0
AnyGraph: Graph Foundation Model in the Wild | Code | 3
AdapMoE: Adaptive Sensitivity-based Expert Gating and Management for Efficient MoE Inference | Code | 1
A Unified Framework for Iris Anti-Spoofing: Introducing IrisGeneral Dataset and Masked-MoE Method | | 0
Customizing Language Models with Instance-wise LoRA for Sequential Recommendation | Code | 1
FEDKIM: Adaptive Federated Knowledge Injection into Medical Foundation Models | Code | 0
Page 59 of 132

No leaderboard results yet.