SOTAVerified

Mixture-of-Experts

Papers

Showing 521–530 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| MLP-KAN: Unifying Deep Representation and Function Learning | Code | 0 |
| Revisiting Prefix-tuning: Statistical Benefits of Reparameterization among Prompts | Code | 0 |
| Efficient Residual Learning with Mixture-of-Experts for Universal Dexterous Grasping | | 0 |
| Searching for Efficient Linear Layers over a Continuous Space of Structured Matrices | Code | 1 |
| Neutral residues: revisiting adapters for model extension | | 0 |
| EC-DIT: Scaling Diffusion Transformers with Adaptive Expert-Choice Routing | | 0 |
| Upcycling Instruction Tuning from Dense to Mixture-of-Experts via Parameter Merging | | 0 |
| The Labyrinth of Links: Navigating the Associative Maze of Multi-modal LLMs | | 0 |
| Open-RAG: Enhanced Retrieval-Augmented Reasoning with Open-Source Large Language Models | Code | 2 |
| UniAdapt: A Universal Adapter for Knowledge Calibration | | 0 |
Page 53 of 132
