SOTAVerified

Mixture-of-Experts

Papers

Showing 911–920 of 1312 papers

Title | Status | Hype
Fast Feedforward Networks | Code | 2
Motion In-Betweening with Phase Manifolds | Code | 2
Pre-gated MoE: An Algorithm-System Co-Design for Fast and Scalable Mixture-of-Expert Inference | Code | 1
EVE: Efficient Vision-Language Pre-training with Masked Prediction and Modality-Aware MoE | | 0
Enhancing NeRF akin to Enhancing LLMs: Generalizable NeRF Transformer with Mixture-of-View-Experts | Code | 1
Beyond Sharing: Conflict-Aware Multivariate Time Series Anomaly Detection | Code | 0
FineQuant: Unlocking Efficiency with Fine-Grained Weight-Only Quantization for LLMs | | 0
HyperFormer: Enhancing Entity and Relation Interaction for Hyper-Relational Knowledge Graph Completion | Code | 1
Experts Weights Averaging: A New General Training Scheme for Vision Transformers | | 0
A Novel Temporal Multi-Gate Mixture-of-Experts Approach for Vehicle Trajectory and Driving Intention Prediction | | 0
Page 92 of 132

No leaderboard results yet.