SOTAVerified

Mixture-of-Experts

Papers

Showing 131–140 of 1,312 papers

| Title | Status | Hype |
| --- | --- | --- |
| MxMoE: Mixed-precision Quantization for MoE with Accuracy and Performance Co-Design | Code | 1 |
| Divide-and-Conquer: Cold-Start Bundle Recommendation via Mixture of Diffusion Experts | — | 0 |
| SToLa: Self-Adaptive Touch-Language Framework with Tactile Commonsense Reasoning in Open-Ended Scenarios | — | 0 |
| Pangu Ultra MoE: How to Train Your Big MoE on Ascend NPUs | — | 0 |
| LLM-e Guess: Can LLMs Capabilities Advance Without Hardware Progress? | Code | 0 |
| STAR-Rec: Making Peace with Length Variance and Pattern Diversity in Sequential Recommendation | — | 0 |
| Faster MoE LLM Inference for Extremely Large Models | — | 0 |
| Towards Smart Point-and-Shoot Photography | — | 0 |
| 3D Gaussian Splatting Data Compression with Mixture of Priors | — | 0 |
| Multimodal Deep Learning-Empowered Beam Prediction in Future THz ISAC Systems | — | 0 |
