| Title | Date | Topics |
| --- | --- | --- |
| Mixture-of-Experts Meets Instruction Tuning: A Winning Combination for Large Language Models | May 24, 2023 | Mixture-of-Experts, Zero-shot Generalization |
| Conditional computation in neural networks: principles and research trends | Mar 12, 2024 | Mixture-of-Experts, Scientific Discovery |
| Fixing MoE Over-Fitting on Low-Resource Languages in Multilingual Machine Translation | Dec 15, 2022 | Machine Translation, Mixture-of-Experts |
| FinTeamExperts: Role Specialized MOEs For Financial Analysis | Oct 28, 2024 | Financial Analysis, Mixture-of-Experts |
| On the Adaptation to Concept Drift for CTR Prediction | Apr 1, 2022 | Click-Through Rate Prediction, Incremental Learning |
| A Review of Sparse Expert Models in Deep Learning | Sep 4, 2022 | Deep Learning, Mixture-of-Experts |
| FineQuant: Unlocking Efficiency with Fine-Grained Weight-Only Quantization for LLMs | Aug 16, 2023 | GPU, Mixture-of-Experts |
| Finding Fantastic Experts in MoEs: A Unified Study for Expert Dropping Strategies and Observations | Apr 8, 2025 | Instruction Following, Mixture-of-Experts |
| Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models | Jun 5, 2024 | Mixture-of-Experts, Time Series |
| Complexity Experts are Task-Discriminative Learners for Any Image Restoration | Nov 27, 2024 | Attribute, Blind All-in-One Image Restoration |