| MoE-MLoRA for Multi-Domain CTR Prediction: Efficient Adaptation with Expert Specialization | Jun 9, 2025 | Click-Through Rate Prediction, Diversity | Code Available | 0 |
| Checkmating One, by Using Many: Combining Mixture of Experts with MCTS to Improve in Chess | Jan 30, 2024 | Mixture-of-Experts | Code Available | 0 |
| MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing | Aug 21, 2024 | Mixture-of-Experts | Code Available | 0 |
| Subjective and Objective Analysis of Indian Social Media Video Quality | Jan 5, 2024 | Mixture-of-Experts, Visual Question Answering (VQA) | Code Available | 0 |
| Sub-MoE: Efficient Mixture-of-Expert LLMs Compression via Subspace Expert Merging | Jun 29, 2025 | Inference Optimization, Mixture-of-Experts | Code Available | 0 |
| Nesti-Net: Normal Estimation for Unstructured 3D Point Clouds using Convolutional Neural Networks | Dec 3, 2018 | Mixture-of-Experts, Surface Normals Estimation | Code Available | 0 |
| Catching Attention with Automatic Pull Quote Selection | May 27, 2020 | Articles, Mixture-of-Experts | Code Available | 0 |
| EAQuant: Enhancing Post-Training Quantization for MoE Models via Expert-Aware Optimization | Jun 16, 2025 | Mixture-of-Experts, Model Compression | Code Available | 0 |
| DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism | Apr 1, 2025 | Common Sense Reasoning, Computational Efficiency | Code Available | 0 |
| Hierarchical Mixture of Experts: Generalizable Learning for High-Level Synthesis | Oct 25, 2024 | High-Level Synthesis, Mixture-of-Experts | Code Available | 0 |
| A Mixture-of-Experts Model for Antonym-Synonym Discrimination | Aug 1, 2021 | Mixture-of-Experts | Code Available | 0 |
| Hierarchical Deep Recurrent Architecture for Video Understanding | Jul 11, 2017 | Classification, General Classification | Code Available | 0 |
| Swift Hydra: Self-Reinforcing Generative Framework for Anomaly Detection with Multiple Mamba Models | Mar 9, 2025 | Anomaly Detection, Mamba | Code Available | 0 |
| DutyTTE: Deciphering Uncertainty in Origin-Destination Travel Time Estimation | Aug 23, 2024 | Deep Reinforcement Learning, Mixture-of-Experts | Code Available | 0 |
| DSelect-k: Differentiable Selection in the Mixture of Experts with Applications to Multi-Task Learning | Jun 7, 2021 | Mixture-of-Experts, Multi-Task Learning | Code Available | 0 |
| Domain-Agnostic Neural Architecture for Class Incremental Continual Learning in Document Processing Platform | Jul 11, 2023 | Continual Learning, Mixture-of-Experts | Code Available | 0 |
| MoE-I^2: Compressing Mixture of Experts Models through Inter-Expert Pruning and Intra-Expert Low-Rank Decomposition | Nov 1, 2024 | Mixture-of-Experts | Code Available | 0 |
| Non-Normal Mixtures of Experts | Jun 22, 2015 | Clustering, Mixture-of-Experts | Code Available | 0 |
| Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts | Jul 19, 2018 | Binary Classification, Click-Through Rate Prediction | Code Available | 0 |
| Modality-Independent Brain Lesion Segmentation with Privacy-aware Continual Learning | Mar 26, 2025 | Continual Learning, Knowledge Distillation | Code Available | 0 |
| Not All Models Suit Expert Offloading: On Local Routing Consistency of Mixture-of-Expert Models | May 21, 2025 | All, CPU | Code Available | 0 |
| Not Eliminate but Aggregate: Post-Hoc Control over Mixture-of-Experts to Address Shortcut Shifts in Natural Language Understanding | Jun 17, 2024 | Mixture-of-Experts, Natural Language Understanding | Code Available | 0 |
| MLP-KAN: Unifying Deep Representation and Function Learning | Oct 3, 2024 | Kolmogorov-Arnold Networks, Mixture-of-Experts | Code Available | 0 |
| Mixture-of-Supernets: Improving Weight-Sharing Supernet Training with Architecture-Routed Mixture-of-Experts | Jun 8, 2023 | Language Modeling | Code Available | 0 |
| Mixture of Nested Experts: Adaptive Processing of Visual Tokens | Jul 29, 2024 | Mixture-of-Experts | Code Available | 0 |