SOTAVerified

Mixture-of-Experts

Papers

Showing 641–650 of 1312 papers

Title | Status | Hype
LoRA-Switch: Boosting the Efficiency of Dynamic LLM Adapters via System-Algorithm Co-design | — | 0
Efficient Mixture-of-Expert for Video-based Driver State and Physiological Multi-task Estimation in Conditional Autonomous Driving | — | 0
Massively Multilingual Shallow Fusion with Large Language Models | — | 0
LoRA-Mixer: Coordinate Modular LoRA Experts Through Serial Attention Routing | — | 0
MAST-Pro: Dynamic Mixture-of-Experts for Adaptive Segmentation of Pan-Tumors with Knowledge-Driven Prompts | — | 0
Efficient and Effective Weight-Ensembling Mixture of Experts for Multi-Task Model Merging | — | 0
Adaptive Segmentation-Based Initialization for Steered Mixture of Experts Image Regression | — | 0
Boosting Code-Switching ASR with Mixture of Experts Enhanced Speech-Conditioned LLM | — | 0
Mean-field limit from general mixtures of experts to quantum neural networks | — | 0
Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition | — | 0
Page 65 of 132

No leaderboard results yet.