| Title | Date | Tags |
| --- | --- | --- |
| A Review of DeepSeek Models' Key Innovative Techniques | Mar 14, 2025 | Mixture-of-Experts, Reinforcement Learning |
| AdaMV-MoE: Adaptive Multi-Task Vision Mixture-of-Experts | Jan 1, 2023 | Instance Segmentation, Mixture-of-Experts |
| Learning to Specialize: Joint Gating-Expert Training for Adaptive MoEs in Decentralized Settings | Jun 14, 2023 | Diversity, Federated Learning |
| FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts | Aug 21, 2024 | Federated Learning, Heuristic Search |
| FedMoE-DA: Federated Mixture of Experts via Domain Aware Fine-grained Aggregation | Nov 4, 2024 | Federated Learning, Mixture-of-Experts |
| FedMerge: Federated Personalization via Model Merging | Apr 9, 2025 | Federated Learning, Mixture-of-Experts |
| A Provably Effective Method for Pruning Experts in Fine-tuned Sparse Mixture-of-Experts | May 26, 2024 | Binary Classification, Mixture-of-Experts |
| Affect in Tweets Using Experts Model | Mar 20, 2019 | Mixture-of-Experts, Model |
| Federated Mixture of Experts | Jul 14, 2021 | Federated Learning, Mixture-of-Experts |
| Federated learning using mixture of experts | Jan 1, 2021 | Federated Learning, Mixture-of-Experts |