| MxMoE: Mixed-precision Quantization for MoE with Accuracy and Performance Co-Design | May 9, 2025 | Mixture-of-Experts, Quantization | Code Available | 1 |
| Divide-and-Conquer: Cold-Start Bundle Recommendation via Mixture of Diffusion Experts | May 8, 2025 | Mixture-of-Experts | Unverified | 0 |
| SToLa: Self-Adaptive Touch-Language Framework with Tactile Commonsense Reasoning in Open-Ended Scenarios | May 7, 2025 | Diversity, Mixture-of-Experts | Unverified | 0 |
| Pangu Ultra MoE: How to Train Your Big MoE on Ascend NPUs | May 7, 2025 | Mixture-of-Experts | Unverified | 0 |
| LLM-e Guess: Can LLMs Capabilities Advance Without Hardware Progress? | May 7, 2025 | Large Language Model, Mixture-of-Experts | Code Available | 0 |
| STAR-Rec: Making Peace with Length Variance and Pattern Diversity in Sequential Recommendation | May 6, 2025 | Diversity, Mixture-of-Experts | Unverified | 0 |
| Faster MoE LLM Inference for Extremely Large Models | May 6, 2025 | Inference Optimization, Mixture-of-Experts | Unverified | 0 |
| Towards Smart Point-and-Shoot Photography | May 6, 2025 | Mixture-of-Experts, Word Embeddings | Unverified | 0 |
| 3D Gaussian Splatting Data Compression with Mixture of Priors | May 6, 2025 | 3DGS, Data Compression | Unverified | 0 |
| Multimodal Deep Learning-Empowered Beam Prediction in Future THz ISAC Systems | May 5, 2025 | Beam Prediction, Deep Learning | Unverified | 0 |