SOTAVerified

Mixture-of-Experts

Papers

Showing 751–775 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| Upcycling Instruction Tuning from Dense to Mixture-of-Experts via Parameter Merging | | 0 |
| EC-DIT: Scaling Diffusion Transformers with Adaptive Expert-Choice Routing | | 0 |
| UniAdapt: A Universal Adapter for Knowledge Calibration | | 0 |
| MoS: Unleashing Parameter Efficiency of Low-Rank Adaptation with Mixture of Shards | | 0 |
| Robust Traffic Forecasting against Spatial Shift over Years | Code | 0 |
| MM1.5: Methods, Analysis & Insights from Multimodal LLM Fine-tuning | | 0 |
| IDEA: An Inverse Domain Expert Adaptation Based Active DNN IP Protection Method | | 0 |
| SciDFM: A Large Language Model with Mixture-of-Experts for Science | | 0 |
| Toward Mixture-of-Experts Enabled Trustworthy Semantic Communication for 6G Networks | | 0 |
| Boosting Code-Switching ASR with Mixture of Experts Enhanced Speech-Conditioned LLM | | 0 |
| Leveraging Mixture of Experts for Improved Speech Deepfake Detection | | 0 |
| Multi-Modal Generative AI: Multi-modal LLM, Diffusion and Beyond | | 0 |
| A Gated Residual Kolmogorov-Arnold Networks for Mixtures of Experts | Code | 0 |
| Routing in Sparsely-gated Language Models responds to Context | | 0 |
| Multi-omics data integration for early diagnosis of hepatocellular carcinoma (HCC) using machine learning | | 0 |
| On-Device Collaborative Language Modeling via a Mixture of Generalists and Specialists | Code | 0 |
| Robust Audiovisual Speech Recognition Models with Mixture-of-Experts | | 0 |
| Mixture of Diverse Size Experts | | 0 |
| GRIN: GRadient-INformed MoE | | 0 |
| LPT++: Efficient Training on Mixture of Long-tailed Experts | | 0 |
| Adaptive Segmentation-Based Initialization for Steered Mixture of Experts Image Regression | | 0 |
| Integrating AI's Carbon Footprint into Risk Management Frameworks: Strategies and Tools for Sustainable Compliance in Banking Sector | | 0 |
| STUN: Structured-Then-Unstructured Pruning for Scalable MoE Pruning | | 0 |
| VE: Modeling Multivariate Time Series Correlation with Variate Embedding | Code | 0 |
| DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models | | 0 |
Page 31 of 53

No leaderboard results yet.