SOTAVerified

Mixture-of-Experts

Papers

Showing 701–750 of 1312 papers

Title | Status | Hype
Steered Mixture of Experts Regression for Image Denoising with Multi-Model-Inference | | 0
Steps are all you need: Rethinking STEM Education with Prompt Engineering | | 0
Stereo-Talker: Audio-driven 3D Human Synthesis with Prior-Guided Mixture-of-Experts | | 0
ST-ExpertNet: A Deep Expert Framework for Traffic Prediction | | 0
SToLa: Self-Adaptive Touch-Language Framework with Tactile Commonsense Reasoning in Open-Ended Scenarios | | 0
StPR: Spatiotemporal Preservation and Routing for Exemplar-Free Video Class-Incremental Learning | | 0
Strength in Numbers: Averaging and Clustering Effects in Mixture of Experts for Graph-Based Dependency Parsing | | 0
Structure-Enhanced Protein Instruction Tuning: Towards General-Purpose Protein Understanding with LLMs | | 0
STUN: Structured-Then-Unstructured Pruning for Scalable MoE Pruning | | 0
Style Mixture of Experts for Expressive Text-To-Speech Synthesis | | 0
Stylistic Variation in Social Media Part-of-Speech Tagging | | 0
SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization | | 0
SUTRA: Scalable Multilingual Language Model Architecture | | 0
Symbolic Mixture-of-Experts: Adaptive Skill-based Routing for Heterogeneous Reasoning | | 0
Tabby: Tabular Data Synthesis with Language Models | | 0
Table-based Fact Verification with Self-adaptive Mixture of Experts | | 0
Table-based Fact Verification with Self-labeled Keypoint Alignment | | 0
TAL: Two-stream Adaptive Learning for Generalizable Person Re-identification | | 0
Task-Based MoE for Multitask Multilingual Machine Translation | | 0
Task-customized Masked AutoEncoder via Mixture of Cluster-conditional Experts | | 0
Task-Specific Expert Pruning for Sparse Mixture-of-Experts | | 0
Team Deep Mixture of Experts for Distributed Power Control | | 0
Terminating Differentiable Tree Experts | | 0
The Empirical Impact of Reducing Symmetries on the Performance of Deep Ensembles and MoE | | 0
The Labyrinth of Links: Navigating the Associative Maze of Multi-modal LLMs | | 0
Theory of Mixture-of-Experts for Mobile Edge Computing | | 0
Theory on Mixture-of-Experts in Continual Learning | | 0
The power of fine-grained experts: Granularity boosts expressivity in Mixture of Experts | | 0
The Ultimate Guide to Fine-Tuning LLMs from Basics to Breakthroughs: An Exhaustive Review of Technologies, Research, Best Practices, Applied Research Challenges and Opportunities | | 0
THOR-MoE: Hierarchical Task-Guided and Context-Responsive Routing for Neural Machine Translation | | 0
Time series forecasting with high stakes: A field study of the air cargo industry | | 0
Time Tracker: Mixture-of-Experts-Enhanced Foundation Time Series Forecasting Model with Decoupled Training Pipelines | | 0
Tiny-Attention Adapter: Contexts Are More Important Than the Number of Parameters | | 0
TMoE-P: Towards the Pareto Optimum for Multivariate Soft Sensors | | 0
ToMoE: Converting Dense Large Language Models to Mixture-of-Experts through Dynamic Structural Pruning | | 0
Topic Compositional Neural Language Model | | 0
To Repeat or Not To Repeat: Insights from Scaling LLM under Token-Crisis | | 0
Toward Mixture-of-Experts Enabled Trustworthy Semantic Communication for 6G Networks | | 0
Towards 3D Acceleration for low-power Mixture-of-Experts and Multi-Head Attention Spiking Transformers | | 0
Towards A Better Metric for Text-to-Video Generation | | 0
Towards an empirical understanding of MoE design choices | | 0
Towards A Unified View of Sparse Feed-Forward Network in Pretraining Large Language Model | | 0
Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts | | 0
Towards Efficient Foundation Model for Zero-shot Amodal Segmentation | | 0
Towards Efficient Single Image Dehazing and Desnowing | | 0
Towards Foundational Models for Dynamical System Reconstruction: Hierarchical Meta-Learning via Mixture of Experts | | 0
Towards Lightweight Neural Animation : Exploration of Neural Network Pruning in Mixture of Experts-based Animation Models | | 0
Towards MoE Deployment: Mitigating Inefficiencies in Mixture-of-Expert (MoE) Inference | | 0
Towards Personalized Federated Multi-Scenario Multi-Task Recommendation | | 0
Towards Smart Point-and-Shoot Photography | | 0
Page 15 of 27
