| A Survey on Model MoErging: Recycling and Routing Among Specialized Experts for Collaborative Learning | Aug 13, 2024 | Mixture-of-Experts, Survey | Unverified | 0 |
| HoME: Hierarchy of Multi-Gate Experts for Multi-Task Learning at Kuaishou | Aug 10, 2024 | Mixture-of-Experts, Multi-Task Learning | Unverified | 0 |
| LaDiMo: Layer-wise Distillation Inspired MoEfier | Aug 8, 2024 | Knowledge Distillation, Mixture-of-Experts | Unverified | 0 |
| Understanding the Performance and Estimating the Cost of LLM Fine-Tuning | Aug 8, 2024 | GPU, Mixture-of-Experts | Code Available | 0 |
| MoC-System: Efficient Fault Tolerance for Sparse Mixture-of-Experts Model Training | Aug 8, 2024 | Mixture-of-Experts | Unverified | 0 |
| Mixture-of-Noises Enhanced Forgery-Aware Predictor for Multi-Face Manipulation Detection and Localization | Aug 5, 2024 | Face Detection, Mixture-of-Experts | Unverified | 0 |
| HMDN: Hierarchical Multi-Distribution Network for Click-Through Rate Prediction | Aug 2, 2024 | Click-Through Rate Prediction, Mixture-of-Experts | Unverified | 0 |
| Multimodal Fusion and Coherence Modeling for Video Topic Segmentation | Aug 1, 2024 | Contrastive Learning, Mixture-of-Experts | Unverified | 0 |
| MoMa: Efficient Early-Fusion Pre-training with Mixture of Modality-Aware Experts | Jul 31, 2024 | Causal Inference, Language Modelling | Unverified | 0 |
| PMoE: Progressive Mixture of Experts with Asymmetric Transformer for Continual Learning | Jul 31, 2024 | Continual Learning, General Knowledge | Unverified | 0 |