SOTAVerified

Mixture-of-Experts

Papers

Showing 371–380 of 1312 papers

Title | Status | Hype
CoLA: Collaborative Low-Rank Adaptation | Code | 0
More Experts Than Galaxies: Conditionally-overlapping Experts With Biologically-Inspired Fixed Routing | Code | 0
FactorLLM: Factorizing Knowledge via Mixture of Experts for Large Language Models | Code | 0
MoE-I^2: Compressing Mixture of Experts Models through Inter-Expert Pruning and Intra-Expert Low-Rank Decomposition | Code | 0
Extreme Classification in Log Memory using Count-Min Sketch: A Case Study of Amazon Search with 50M Products | Code | 0
Modality-Independent Brain Lesion Segmentation with Privacy-aware Continual Learning | Code | 0
Cluster-Driven Expert Pruning for Mixture-of-Experts Large Language Models | Code | 0
Exploring Model Consensus to Generate Translation Paraphrases | Code | 0
Exploiting Activation Sparsity with Dense to Dynamic-k Mixture-of-Experts Conversion | Code | 0
MLP-KAN: Unifying Deep Representation and Function Learning | Code | 0
Page 38 of 132

No leaderboard results yet.