
Mixture-of-Experts

Papers

Showing 41–50 of 1312 papers (page 5 of 132)

Title | Status | Hype
DeepSpeed Inference: Enabling Efficient Inference of Transformer Models at Unprecedented Scale | Code | 4
FlashDMoE: Fast Distributed MoE in a Single Kernel | Code | 3
Learning Heterogeneous Mixture of Scene Experts for Large-scale Neural Radiance Fields | Code | 3
A Survey on Inference Optimization Techniques for Mixture of Experts Models | Code | 3
Generalizing Motion Planners with Mixture of Experts for Autonomous Driving | Code | 3
LLaVA-MoD: Making LLaVA Tiny via MoE Knowledge Distillation | Code | 3
AnyGraph: Graph Foundation Model in the Wild | Code | 3
YourMT3+: Multi-instrument Music Transcription with Enhanced Transformer Architectures and Cross-dataset Stem Augmentation | Code | 3
A Survey on Mixture of Experts | Code | 3
Reservoir History Matching of the Norne field with generative exotic priors and a coupled Mixture of Experts -- Physics Informed Neural Operator Forward Model | Code | 3

No leaderboard results yet.