| Integrating Dynamical Systems Learning with Foundational Models: A Meta-Evolutionary AI Framework for Clinical Trials | May 25, 2025 | Evolutionary Algorithms, Large Language Model | — Unverified | 0 |
| μ-MoE: Test-Time Pruning as Micro-Grained Mixture-of-Experts | May 24, 2025 | Mixture-of-Experts | — Unverified | 0 |
| On Minimax Estimation of Parameters in Softmax-Contaminated Mixture of Experts | May 24, 2025 | Mixture-of-Experts | — Unverified | 0 |
| Guiding the Experts: Semantic Priors for Efficient and Focused MoE Routing | May 24, 2025 | Mixture-of-Experts | Code Available | 0 |
| Mod-Adapter: Tuning-Free and Versatile Multi-concept Personalization via Modulation Adapter | May 24, 2025 | Image Generation, Mixture-of-Experts | — Unverified | 0 |
| TrajMoE: Spatially-Aware Mixture of Experts for Unified Human Mobility Modeling | May 24, 2025 | Mixture-of-Experts | — Unverified | 0 |
| EvidenceMoE: A Physics-Guided Mixture-of-Experts with Evidential Critics for Advancing Fluorescence Light Detection and Ranging in Scattering Media | May 23, 2025 | Depth Estimation, Mixture-of-Experts | — Unverified | 0 |
| DualComp: End-to-End Learning of a Unified Dual-Modality Lossless Compressor | May 22, 2025 | Mixture-of-Experts | — Unverified | 0 |
| DriveMoE: Mixture-of-Experts for Vision-Language-Action Model in End-to-End Autonomous Driving | May 22, 2025 | Autonomous Driving, Bench2Drive | — Unverified | 0 |
| Time Tracker: Mixture-of-Experts-Enhanced Foundation Time Series Forecasting Model with Decoupled Training Pipelines | May 21, 2025 | Graph Learning, Mixture-of-Experts | — Unverified | 0 |