| MoE-MLoRA for Multi-Domain CTR Prediction: Efficient Adaptation with Expert Specialization | Jun 9, 2025 | Click-Through Rate Prediction, Diversity | Code Available | 0 |
| Checkmating One, by Using Many: Combining Mixture of Experts with MCTS to Improve in Chess | Jan 30, 2024 | Mixture-of-Experts | Code Available | 0 |
| MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing | Aug 21, 2024 | Mixture-of-Experts | Code Available | 0 |
| Subjective and Objective Analysis of Indian Social Media Video Quality | Jan 5, 2024 | Mixture-of-Experts, Visual Question Answering (VQA) | Code Available | 0 |
| Sub-MoE: Efficient Mixture-of-Expert LLMs Compression via Subspace Expert Merging | Jun 29, 2025 | Inference Optimization, Mixture-of-Experts | Code Available | 0 |
| Nesti-Net: Normal Estimation for Unstructured 3D Point Clouds using Convolutional Neural Networks | Dec 3, 2018 | Mixture-of-Experts, Surface Normals Estimation | Code Available | 0 |
| Catching Attention with Automatic Pull Quote Selection | May 27, 2020 | Articles, Mixture-of-Experts | Code Available | 0 |
| EAQuant: Enhancing Post-Training Quantization for MoE Models via Expert-Aware Optimization | Jun 16, 2025 | Mixture-of-Experts, Model Compression | Code Available | 0 |
| DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism | Apr 1, 2025 | Common Sense Reasoning, Computational Efficiency | Code Available | 0 |
| Hierarchical Mixture of Experts: Generalizable Learning for High-Level Synthesis | Oct 25, 2024 | High-Level Synthesis, Mixture-of-Experts | Code Available | 0 |