| Title | Date | Tags | Code | Count |
| --- | --- | --- | --- | --- |
| MoS: Unleashing Parameter Efficiency of Low-Rank Adaptation with Mixture of Shards | Oct 1, 2024 | GPU, Mixture-of-Experts | Unverified | 0 |
| Robust Traffic Forecasting against Spatial Shift over Years | Oct 1, 2024 | Attribute, Mixture-of-Experts | Code Available | 0 |
| MM1.5: Methods, Analysis & Insights from Multimodal LLM Fine-tuning | Sep 30, 2024 | Mixture-of-Experts, Optical Character Recognition (OCR) | Unverified | 0 |
| IDEA: An Inverse Domain Expert Adaptation Based Active DNN IP Protection Method | Sep 29, 2024 | Domain Adaptation, Mixture-of-Experts | Unverified | 0 |
| CLIP-MoE: Towards Building Mixture of Experts for CLIP with Diversified Multiplet Upcycling | Sep 28, 2024 | Image Classification | Code Available | 2 |
| SciDFM: A Large Language Model with Mixture-of-Experts for Science | Sep 27, 2024 | Language Modelling | Unverified | 0 |
| A Time Series is Worth Five Experts: Heterogeneous Mixture of Experts for Traffic Flow Prediction | Sep 26, 2024 | Mixture-of-Experts, Prediction | Code Available | 1 |
| Uni-Med: A Unified Medical Generalist Foundation Model For Multi-Task Learning Via Connector-MoE | Sep 26, 2024 | Image Classification | Code Available | 1 |
| Leveraging Mixture of Experts for Improved Speech Deepfake Detection | Sep 24, 2024 | DeepFake Detection, Face Swapping | Unverified | 0 |
| Toward Mixture-of-Experts Enabled Trustworthy Semantic Communication for 6G Networks | Sep 24, 2024 | Mixture-of-Experts, Semantic Communication | Unverified | 0 |