| Title | Date | Tags | Code Status |
|---|---|---|---|
| ForceVLA: Enhancing VLA Models with a Force-aware MoE for Contact-rich Manipulation | May 28, 2025 | Contact-rich Manipulation, Mixture-of-Experts | Unverified |
| A Human-Centric Approach to Explainable AI for Personalized Education | May 28, 2025 | Autonomous Driving, Mixture-of-Experts | Code Available |
| Advancing Expert Specialization for Better MoE | May 28, 2025 | Mixture-of-Experts | Unverified |
| EvoMoE: Expert Evolution in Mixture of Experts for Multimodal Large Language Models | May 28, 2025 | Mixture-of-Experts, MME | Unverified |
| MoE-Gyro: Self-Supervised Over-Range Reconstruction and Denoising for MEMS Gyroscopes | May 27, 2025 | Benchmarking, Denoising | Unverified |
| Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments | May 26, 2025 | Data-free Knowledge Distillation, Federated Learning | Code Available |
| NEXT: Multi-Grained Mixture of Experts via Text-Modulation for Multi-Modal Object Re-ID | May 26, 2025 | Attribute, Caption Generation | Unverified |
| MoESD: Unveil Speculative Decoding's Potential for Accelerating Sparse MoE | May 26, 2025 | Mixture-of-Experts | Unverified |
| Rethinking Gating Mechanism in Sparse MoE: Handling Arbitrary Modality Inputs with Confidence-Guided Gate | May 26, 2025 | Imputation, Mixture-of-Experts | Code Available |
| Integrating Dynamical Systems Learning with Foundational Models: A Meta-Evolutionary AI Framework for Clinical Trials | May 25, 2025 | Evolutionary Algorithms, Large Language Model | Unverified |