SOTAVerified

Mixture-of-Experts

Papers

Showing 501–525 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| GETS: Ensemble Temperature Scaling for Calibration in Graph Neural Networks | | 0 |
| Retraining-Free Merging of Sparse MoE via Hierarchical Clustering | Code | 1 |
| Flex-MoE: Modeling Arbitrary Modality Combination via the Flexible Mixture-of-Experts | Code | 2 |
| More Experts Than Galaxies: Conditionally-overlapping Experts With Biologically-Inspired Fixed Routing | Code | 0 |
| Mono-InternVL: Pushing the Boundaries of Monolithic Multimodal Large Language Models with Endogenous Visual Pre-training | | 0 |
| Upcycling Large Language Models into Mixture of Experts | | 0 |
| Efficient Dictionary Learning with Switch Sparse Autoencoders | Code | 1 |
| MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | Code | 4 |
| Functional-level Uncertainty Quantification for Calibrated Fine-tuning on LLMs | | 0 |
| Toward generalizable learning of all (linear) first-order methods via memory augmented Transformers | | 0 |
| MC-MoE: Mixture Compressor for Mixture-of-Experts LLMs Gains More | Code | 2 |
| Aria: An Open Multimodal Native Mixture-of-Experts Model | Code | 5 |
| Probing the Robustness of Theory of Mind in Large Language Models | | 0 |
| Scaling Laws Across Model Architectures: A Comparative Analysis of Dense and MoE Models in Large Language Models | | 0 |
| Model-GLUE: Democratized LLM Scaling for A Large Model Zoo in the Wild | Code | 1 |
| Multimodal Fusion Strategies for Mapping Biophysical Landscape Features | Code | 0 |
| Realizing Video Summarization from the Path of Language-based Semantic Understanding | | 0 |
| A Dynamic Approach to Stock Price Prediction: Comparing RNN and Mixture of Experts Models Across Different Volatility Profiles | | 0 |
| Structure-Enhanced Protein Instruction Tuning: Towards General-Purpose Protein Understanding with LLMs | | 0 |
| On Expert Estimation in Hierarchical Mixture of Experts: Beyond Softmax Gating Functions | | 0 |
| MLP-KAN: Unifying Deep Representation and Function Learning | Code | 0 |
| Searching for Efficient Linear Layers over a Continuous Space of Structured Matrices | Code | 1 |
| Efficient Residual Learning with Mixture-of-Experts for Universal Dexterous Grasping | | 0 |
| Revisiting Prefix-tuning: Statistical Benefits of Reparameterization among Prompts | Code | 0 |
| Neutral residues: revisiting adapters for model extension | | 0 |
Page 21 of 53

No leaderboard results yet.