SOTAVerified

Mixture-of-Experts

Papers

Showing 201–210 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| MeteoRA: Multiple-tasks Embedded LoRA for Large Language Models | Code | 1 |
| Condense, Don't Just Prune: Enhancing Efficiency and Performance in MoE Layer Pruning | Code | 1 |
| Image Super-resolution Via Latent Diffusion: A Sampling-space Mixture Of Experts And Frequency-augmented Decoder Approach | Code | 1 |
| HyperRouter: Towards Efficient Training and Inference of Sparse Mixture of Experts | Code | 1 |
| HyperMoE: Towards Better Mixture of Experts via Transferring Among Experts | Code | 1 |
| Jakiro: Boosting Speculative Decoding with Decoupled Multi-Head via MoE | Code | 1 |
| HydraSum: Disentangling Stylistic Features in Text Summarization using Multi-Decoder Models | Code | 1 |
| Hierarchical Time-Aware Mixture of Experts for Multi-Modal Sequential Recommendation | Code | 1 |
| Mixture of Decision Trees for Interpretable Machine Learning | Code | 1 |
| HyperFormer: Enhancing Entity and Relation Interaction for Hyper-Relational Knowledge Graph Completion | Code | 1 |

Leaderboard

No leaderboard results yet.