SOTAVerified

Mixture-of-Experts

Papers

Showing 91–100 of 1312 papers

Title | Status | Hype
Time Tracker: Mixture-of-Experts-Enhanced Foundation Time Series Forecasting Model with Decoupled Training Pipelines | – | 0
MoRE-Brain: Routed Mixture of Experts for Interpretable and Generalizable Cross-Subject fMRI Visual Decoding | Code | 0
Hunyuan-TurboS: Advancing Large Language Models through Mamba-Transformer Synergy and Adaptive Chain-of-Thought | – | 0
Efficient Data Driven Mixture-of-Expert Extraction from Trained Networks | – | 0
Multimodal Cultural Safety: Evaluation Frameworks and Alignment Strategies | Code | 0
Balanced and Elastic End-to-end Training of Dynamic LLMs | – | 0
Multimodal Mixture of Low-Rank Experts for Sentiment Analysis and Emotion Recognition | – | 0
THOR-MoE: Hierarchical Task-Guided and Context-Responsive Routing for Neural Machine Translation | – | 0
Two Experts Are All You Need for Steering Thinking: Reinforcing Cognitive Effort in MoE Reasoning Models Without Additional Training | – | 0
FuxiMT: Sparsifying Large Language Models for Chinese-Centric Multilingual Machine Translation | – | 0
Page 10 of 132

No leaderboard results yet.