| Title | Date | Tags |
| --- | --- | --- |
| ADMoE: Anomaly Detection with Mixture-of-Experts from Noisy Labels | Aug 24, 2022 | Anomaly Detection, Mixture-of-Experts |
| Modular Action Concept Grounding in Semantic Video Prediction | Nov 23, 2020 | Action Recognition, Mixture-of-Experts |
| EPS-MoE: Expert Pipeline Scheduler for Cost-Efficient MoE Inference | Oct 16, 2024 | Computational Efficiency, Large Language Model |
| Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition | May 12, 2023 | Bayesian Inference, Mixture-of-Experts |
| Ensemble Learning for Large Language Models in Text and Code Generation: A Survey | Mar 13, 2025 | Code Generation, Ensemble Learning |
| Non-asymptotic oracle inequalities for the Lasso in high-dimensional mixture of experts | Sep 22, 2020 | Feature Selection, Mixture-of-Experts |
| Routing in Sparsely-gated Language Models responds to Context | Sep 21, 2024 | Decoder, Mixture-of-Experts |
| Enhancing the "Immunity" of Mixture-of-Experts Networks for Adversarial Defense | Feb 29, 2024 | Adversarial Defense, Adversarial Robustness |
| Capacity-Aware Inference: Mitigating the Straggler Effect in Mixture of Experts | Mar 7, 2025 | Mixture-of-Experts |
| Enhancing Multi-modal Models with Heterogeneous MoE Adapters for Fine-tuning | Mar 26, 2025 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning |