| Title | Date | Tags | Code | Links |
| --- | --- | --- | --- | --- |
| Enhancing Multi-modal Models with Heterogeneous MoE Adapters for Fine-tuning | Mar 26, 2025 | Mixture-of-Experts, parameter-efficient fine-tuning | Unverified | 0 |
| Optimal Scaling Laws for Efficiency Gains in a Theoretical Transformer-Augmented Sectional MoE Framework | Mar 26, 2025 | Computational Efficiency, Mixture-of-Experts | Unverified | 0 |
| M^2CD: A Unified MultiModal Framework for Optical-SAR Change Detection with Mixture of Experts and Self-Distillation | Mar 25, 2025 | Change Detection, Disaster Response | Unverified | 0 |
| Resilient Sensor Fusion under Adverse Sensor Failures via Multi-Modal Expert Fusion | Mar 25, 2025 | Autonomous Driving, Mixture-of-Experts | Unverified | 0 |
| BiPrompt-SAM: Enhancing Image Segmentation via Explicit Selection between Point and Text Prompts | Mar 25, 2025 | Image Segmentation, Mixture-of-Experts | Unverified | 0 |
| Galaxy Walker: Geometry-aware VLMs For Galaxy-scale Understanding | Mar 24, 2025 | Mixture-of-Experts, Morphology classification | Unverified | 0 |
| SPMTrack: Spatio-Temporal Parameter-Efficient Fine-Tuning with Mixture of Experts for Scalable Visual Tracking | Mar 24, 2025 | Mixture-of-Experts, parameter-efficient fine-tuning | Code Available | 1 |
| ExpertRAG: Efficient RAG with Mixture of Experts -- Optimizing Context Retrieval for Adaptive LLM Responses | Mar 23, 2025 | Language Modeling | Unverified | 0 |
| Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM | Mar 22, 2025 | Code Generation, Mixture-of-Experts | Unverified | 0 |
| Mixture of Lookup Experts | Mar 20, 2025 | Mixture-of-Experts | Code Available | 2 |