
Mixture-of-Experts

Papers

Showing 71–80 of 1312 papers

Title | Status | Hype
LiMoE: Mixture of LiDAR Representation Learners from Automotive Scenes | Code | 2
Superposition in Transformers: A Novel Way of Building Mixture of Experts | Code | 2
ReMoE: Fully Differentiable Mixture-of-Experts with ReLU Routing | Code | 2
DeMo: Decoupled Feature-Based Mixture of Experts for Multi-Modal Object Re-Identification | Code | 2
Towards a Multimodal Large Language Model with Pixel-Level Insight for Biomedicine | Code | 2
Object Detection using Event Camera: A MoE Heat Conduction based Detector and A New Benchmark Dataset | Code | 2
Monet: Mixture of Monosemantic Experts for Transformers | Code | 2
LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training | Code | 2
CNMBERT: A Model for Converting Hanyu Pinyin Abbreviations to Chinese Characters | Code | 2
SLED: Self Logits Evolution Decoding for Improving Factuality in Large Language Models | Code | 2
Page 8 of 132

No leaderboard results yet.