| A Unified Virtual Mixture-of-Experts Framework: Enhanced Inference and Hallucination Mitigation in Single-Model System | Apr 1, 2025 | Dialogue Generation, Ensemble Learning | Unverified | 0 |
| DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism | Apr 1, 2025 | Common Sense Reasoning, Computational Efficiency | Code Available | 0 |
| Detecting Financial Fraud with Hybrid Deep Learning: A Mix-of-Experts Approach to Sequential and Anomalous Patterns | Apr 1, 2025 | Fraud Detection, Mixture-of-Experts | Unverified | 0 |
| Unimodal-driven Distillation in Multimodal Emotion Recognition with Dynamic Fusion | Mar 31, 2025 | Emotion Recognition, Knowledge Distillation | Unverified | 0 |
| Mixture of Routers | Mar 30, 2025 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | Unverified | 0 |
| Sparse Mixture of Experts as Unified Competitive Learning | Mar 29, 2025 | Language Modeling | Unverified | 0 |
| Beyond Standard MoE: Mixture of Latent Experts for Resource-Efficient Language Models | Mar 29, 2025 | Computational Efficiency, Mixture-of-Experts | Unverified | 0 |
| S2MoE: Robust Sparse Mixture of Experts via Stochastic Learning | Mar 29, 2025 | Mixture-of-Experts | Unverified | 0 |
| Exploiting Mixture-of-Experts Redundancy Unlocks Multimodal Generative Abilities | Mar 28, 2025 | Mixture-of-Experts, Text Generation | Unverified | 0 |
| iMedImage Technical Report | Mar 27, 2025 | Anomaly Detection, Diagnostic | Unverified | 0 |