SOTAVerified

Mixture-of-Experts

Papers

Showing 61–70 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| Reservoir History Matching of the Norne field with generative exotic priors and a coupled Mixture of Experts -- Physics Informed Neural Operator Forward Model | Code | 3 |
| Mixture of A Million Experts | Code | 2 |
| Efficiently Democratizing Medical LLMs for 50 Languages via a Mixture of Language Family Experts | Code | 2 |
| Mixture of Lookup Experts | Code | 2 |
| MDFEND: Multi-domain Fake News Detection | Code | 2 |
| Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models | Code | 2 |
| Make LoRA Great Again: Boosting LoRA with Adaptive Singular Values and Mixture-of-Experts Optimization Alignment | Code | 2 |
| Dynamic Tuning Towards Parameter and Inference Efficiency for ViT Adaptation | Code | 2 |
| MC-MoE: Mixture Compressor for Mixture-of-Experts LLMs Gains More | Code | 2 |
| MiniDrive: More Efficient Vision-Language Models with Multi-Level 2D Features as Text Tokens for Autonomous Driving | Code | 2 |
Page 7 of 132

No leaderboard results yet.