SOTAVerified

Mixture-of-Experts

Papers

Showing 1151–1160 of 1312 papers

Title | Status | Hype
MoE-MLoRA for Multi-Domain CTR Prediction: Efficient Adaptation with Expert Specialization | Code | 0
Checkmating One, by Using Many: Combining Mixture of Experts with MCTS to Improve in Chess | Code | 0
MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing | Code | 0
Subjective and Objective Analysis of Indian Social Media Video Quality | Code | 0
Sub-MoE: Efficient Mixture-of-Expert LLMs Compression via Subspace Expert Merging | Code | 0
Nesti-Net: Normal Estimation for Unstructured 3D Point Clouds using Convolutional Neural Networks | Code | 0
Catching Attention with Automatic Pull Quote Selection | Code | 0
EAQuant: Enhancing Post-Training Quantization for MoE Models via Expert-Aware Optimization | Code | 0
DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism | Code | 0
Hierarchical Mixture of Experts: Generalizable Learning for High-Level Synthesis | Code | 0
Page 116 of 132

No leaderboard results yet.