SOTAVerified

Mixture-of-Experts

Papers

Showing 476–500 of 1312 papers

Title | Status | Hype
Federated Mixture of Experts | | 0
Hierarchical Mixture-of-Experts Model for Large-Scale Gaussian Process Regression | | 0
Deep Gaussian Covariance Network | | 0
Federated learning using mixture of experts | | 0
Combining Spectral and Self-Supervised Features for Low Resource Speech Recognition and Translation | | 0
FEAMOE: Fair, Explainable and Adaptive Mixture of Experts | | 0
Combining Parametric and Nonparametric Models for Off-Policy Evaluation | | 0
FaVChat: Unlocking Fine-Grained Facial Video Understanding with Multimodal Large Language Models | | 0
HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization | | 0
Combinations of Adaptive Filters | | 0
Holistic Capability Preservation: Towards Compact Yet Comprehensive Reasoning Models | | 0
Aphasic Speech Recognition using a Mixture of Speech Intelligibility Experts | | 0
A Dynamic Approach to Stock Price Prediction: Comparing RNN and Mixture of Experts Models Across Different Volatility Profiles | | 0
LaDiMo: Layer-wise Distillation Inspired MoEfier | | 0
How Do Consumers Really Choose: Exposing Hidden Preferences with the Mixture of Experts Model | | 0
La-SoftMoE CLIP for Unified Physical-Digital Face Attack Detection | | 0
How Lightweight Can A Vision Transformer Be | | 0
Learning Heterogeneous Tissues with Mixture of Experts for Gigapixel Whole Slide Images | | 0
Lifelong Knowledge Editing for Vision Language Models with Low-Rank Mixture-of-Experts | | 0
Hunyuan-TurboS: Advancing Large Language Models through Mamba-Transformer Synergy and Adaptive Chain-of-Thought | | 0
Faster MoE LLM Inference for Extremely Large Models | | 0
Faster Language Models with Better Multi-Token Prediction Using Tensor Decomposition | | 0
CoCoAFusE: Beyond Mixtures of Experts via Model Fusion | | 0
Fast, Differentiable and Sparse Top-k: a Convex Analysis Perspective | | 0
An Unsupervised Domain Adaptation Method for Locating Manipulated Region in partially fake Audio | | 0
Page 20 of 53

No leaderboard results yet.