
Mixture-of-Experts

Papers

Showing 121–130 of 1,312 papers

Title | Status | Hype
Make LoRA Great Again: Boosting LoRA with Adaptive Singular Values and Mixture-of-Experts Optimization Alignment | Code | 2
CLIP-MoE: Towards Building Mixture of Experts for CLIP with Diversified Multiplet Upcycling | Code | 2
LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training | Code | 2
Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning | Code | 2
LiMoE: Mixture of LiDAR Representation Learners from Automotive Scenes | Code | 2
Learning Robust Stereo Matching in the Wild with Selective Mixture-of-Experts | Code | 2
CNMBERT: A Model for Converting Hanyu Pinyin Abbreviations to Chinese Characters | Code | 2
Linear-MoE: Linear Sequence Modeling Meets Mixture-of-Experts | Code | 2
MiniDrive: More Efficient Vision-Language Models with Multi-Level 2D Features as Text Tokens for Autonomous Driving | Code | 2
Open-RAG: Enhanced Retrieval-Augmented Reasoning with Open-Source Large Language Models | Code | 2
Page 13 of 132

No leaderboard results yet.