SOTAVerified

Mixture-of-Experts

Papers

Showing 251–275 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| FreqMoE: Enhancing Time Series Forecasting through Frequency Decomposition Mixture of Experts | Code | 1 |
| Specialized federated learning using a mixture of experts | Code | 1 |
| Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization | Code | 1 |
| Multimodal Variational Autoencoders for Semi-Supervised Learning: In Defense of Product-of-Experts | Code | 1 |
| MomentumSMoE: Integrating Momentum into Sparse Mixture of Experts | Code | 1 |
| BiMediX: Bilingual Medical Mixture of Experts LLM | Code | 1 |
| Exploring Sparse MoE in GANs for Text-conditioned Image Synthesis | Code | 1 |
| MoGERNN: An Inductive Traffic Predictor for Unobserved Locations in Dynamic Sensing Networks | Code | 1 |
| Frequency-Adaptive Pan-Sharpening with Mixture of Experts | Code | 1 |
| MoExtend: Tuning New Experts for Modality and Task Extension | Code | 1 |
| Multi-Head Mixture-of-Experts | Code | 1 |
| EWMoE: An effective model for global weather forecasting with mixture-of-experts | Code | 1 |
| Enhancing NeRF akin to Enhancing LLMs: Generalizable NeRF Transformer with Mixture-of-View-Experts | Code | 1 |
| Enhancing Fast Feed Forward Networks with Load Balancing and a Master Leaf Node | Code | 1 |
| Examining Post-Training Quantization for Mixture-of-Experts: A Benchmark | Code | 1 |
| MoËT: Mixture of Expert Trees and its Application to Verifiable Reinforcement Learning | Code | 1 |
| Emergent Modularity in Pre-trained Transformers | Code | 1 |
| Emotion-Qwen: Training Hybrid Experts for Unified Emotion and General Vision-Language Understanding | Code | 1 |
| MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation | Code | 1 |
| Distribution-aware Forgetting Compensation for Exemplar-Free Lifelong Person Re-identification | Code | 1 |
| XMoE: Sparse Models with Fine-grained and Adaptive Expert Selection | Code | 1 |
| MoEDiff-SR: Mixture of Experts-Guided Diffusion Model for Region-Adaptive MRI Super-Resolution | Code | 1 |
| Distilling the Knowledge in a Neural Network | Code | 1 |
| Efficient and Degradation-Adaptive Network for Real-World Image Super-Resolution | Code | 1 |
| Awaker2.5-VL: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts | Code | 1 |
Page 11 of 53

No leaderboard results yet.