| Configurable Foundation Models: Building LLMs from a Modular Perspective | Sep 4, 2024 | Computational Efficiency, Mixture-of-Experts | —Unverified | 0 |
| 3D-MoE: A Mixture-of-Experts Multi-modal LLM for 3D Vision and Pose Diffusion via Rectified Flow | Jan 28, 2025 | Instruction Following, Mixture-of-Experts | —Unverified | 0 |
| Conditional computation in neural networks: principles and research trends | Mar 12, 2024 | Mixture-of-Experts, scientific discovery | —Unverified | 0 |
| Mixture-of-Experts Meets Instruction Tuning: A Winning Combination for Large Language Models | May 24, 2023 | Mixture-of-Experts, Zero-shot Generalization | —Unverified | 0 |
| On DeepSeekMoE: Statistical Benefits of Shared Experts and Normalized Sigmoid Gating | May 16, 2025 | Language Modeling | —Unverified | 0 |
| Imitation Learning from Observations: An Autoregressive Mixture of Experts Approach | Nov 12, 2024 | Autonomous Driving, Imitation Learning | —Unverified | 0 |
| Fixing MoE Over-Fitting on Low-Resource Languages in Multilingual Machine Translation | Dec 15, 2022 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| FinTeamExperts: Role Specialized MOEs For Financial Analysis | Oct 28, 2024 | Financial Analysis, Mixture-of-Experts | —Unverified | 0 |
| On the Adaptation to Concept Drift for CTR Prediction | Apr 1, 2022 | Click-Through Rate Prediction, Incremental Learning | —Unverified | 0 |
| FlexMoE: Scaling Large-scale Sparse Pre-trained Model Training via Dynamic Device Placement | Apr 8, 2023 | Mixture-of-Experts, Scheduling | —Unverified | 0 |
| FloE: On-the-Fly MoE Inference on Memory-constrained GPU | May 9, 2025 | CPU, GPU | —Unverified | 0 |
| fMoE: Fine-Grained Expert Offloading for Large Mixture-of-Experts Serving | Feb 7, 2025 | CPU, GPU | —Unverified | 0 |
| FMT: A Multimodal Pneumonia Detection Model Based on Stacking MOE Framework | Mar 7, 2025 | Diagnostic, Medical Image Analysis | —Unverified | 0 |
| ForceVLA: Enhancing VLA Models with a Force-aware MoE for Contact-rich Manipulation | May 28, 2025 | Contact-rich Manipulation, Mixture-of-Experts | —Unverified | 0 |
| A Review of Sparse Expert Models in Deep Learning | Sep 4, 2022 | Deep Learning, Mixture-of-Experts | —Unverified | 0 |
| iMedImage Technical Report | Mar 27, 2025 | Anomaly Detection, Diagnostic | —Unverified | 0 |
| FineQuant: Unlocking Efficiency with Fine-Grained Weight-Only Quantization for LLMs | Aug 16, 2023 | GPU, Mixture-of-Experts | —Unverified | 0 |
| Complexity Experts are Task-Discriminative Learners for Any Image Restoration | Nov 27, 2024 | Attribute, Blind All-in-One Image Restoration | —Unverified | 0 |
| Fresh-CL: Feature Realignment through Experts on Hypersphere in Continual Learning | Jan 4, 2025 | Continual Learning, Mixture-of-Experts | —Unverified | 0 |
| From Google Gemini to OpenAI Q* (Q-Star): A Survey of Reshaping the Generative Artificial Intelligence (AI) Research Landscape | Dec 18, 2023 | Mixture-of-Experts | —Unverified | 0 |
| ContextWIN: Whittle Index Based Mixture-of-Experts Neural Model For Restless Bandits Via Deep RL | Oct 13, 2024 | Decision Making, Mixture-of-Experts | —Unverified | 0 |
| FSMoE: A Flexible and Scalable Training System for Sparse Mixture-of-Experts Models | Jan 18, 2025 | GPU, Mixture-of-Experts | —Unverified | 0 |
| Continual Learning Using Task Conditional Neural Networks | Sep 29, 2021 | Continual Learning, Mixture-of-Experts | —Unverified | 0 |
| Full-Precision Free Binary Graph Neural Networks | Sep 29, 2021 | Graph Neural Network, Mixture-of-Experts | —Unverified | 0 |
| Functional-level Uncertainty Quantification for Calibrated Fine-tuning on LLMs | Oct 9, 2024 | Common Sense Reasoning, Mixture-of-Experts | —Unverified | 0 |
| Functional mixture-of-experts for classification | Feb 28, 2022 | Classification, Mixture-of-Experts | —Unverified | 0 |
| FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion | Feb 5, 2024 | Missing Elements, Mixture-of-Experts | —Unverified | 0 |
| FuxiMT: Sparsifying Large Language Models for Chinese-Centric Multilingual Machine Translation | May 20, 2025 | Language Modeling | —Unverified | 0 |
| Finding Fantastic Experts in MoEs: A Unified Study for Expert Dropping Strategies and Observations | Apr 8, 2025 | Instruction Following, Mixture-of-Experts | —Unverified | 0 |
| Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models | Jun 5, 2024 | Mixture-of-Experts, Time Series | —Unverified | 0 |
| A Review of DeepSeek Models' Key Innovative Techniques | Mar 14, 2025 | Mixture-of-Experts, reinforcement-learning | —Unverified | 0 |
| AdaMV-MoE: Adaptive Multi-Task Vision Mixture-of-Experts | Jan 1, 2023 | Instance Segmentation, Mixture-of-Experts | —Unverified | 0 |
| Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers | May 28, 2022 | Machine Translation, Mixture-of-Experts | —Unverified | 0 |
| Imitation Learning from MPC for Quadrupedal Multi-Gait Control | Mar 26, 2021 | Imitation Learning, Mixture-of-Experts | —Unverified | 0 |
| Coordination with Humans via Strategy Matching | Oct 27, 2022 | Mixture-of-Experts | —Unverified | 0 |
| GEMNET: Effective Gated Gazetteer Representations for Recognizing Complex Entities in Low-context Input | Jun 1, 2021 | Mixture-of-Experts, named-entity-recognition | —Unverified | 0 |
| Generalizable Person Re-identification with Relevance-aware Mixture of Experts | May 19, 2021 | Generalizable Person Re-identification, Mixture-of-Experts | —Unverified | 0 |
| Generalization Error Analysis for Sparse Mixture-of-Experts: A Preliminary Study | Mar 26, 2024 | Learning Theory, Mixture-of-Experts | —Unverified | 0 |
| Improved Training of Mixture-of-Experts Language GANs | Feb 23, 2023 | Adversarial Text, Image Generation | —Unverified | 0 |
| Affect in Tweets Using Experts Model | Mar 20, 2019 | Mixture-of-Experts, model | —Unverified | 0 |
| Generator Assisted Mixture of Experts For Feature Acquisition in Batch | Dec 19, 2023 | Mixture-of-Experts | —Unverified | 0 |
| GeRM: A Generalist Robotic Model with Mixture-of-experts for Quadruped Robot | Mar 20, 2024 | Mixture-of-Experts, Multi-Task Learning | —Unverified | 0 |
| Learning to Specialize: Joint Gating-Expert Training for Adaptive MoEs in Decentralized Settings | Jun 14, 2023 | Diversity, Federated Learning | —Unverified | 0 |
| GigaChat Family: Efficient Russian Language Modeling Through Mixture of Experts Architecture | Jun 11, 2025 | Language Modeling | —Unverified | 0 |
| GLA in MediaEval 2018 Emotional Impact of Movies Task | Nov 27, 2019 | Mixture-of-Experts | —Unverified | 0 |
| GLaM: Efficient Scaling of Language Models with Mixture-of-Experts | Dec 13, 2021 | Common Sense Reasoning, In-Context Learning | —Unverified | 0 |
| FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts | Aug 21, 2024 | Federated Learning, Heuristic Search | —Unverified | 0 |
| IDEA: An Inverse Domain Expert Adaptation Based Active DNN IP Protection Method | Sep 29, 2024 | Domain Adaptation, Mixture-of-Experts | —Unverified | 0 |
| FedMoE-DA: Federated Mixture of Experts via Domain Aware Fine-grained Aggregation | Nov 4, 2024 | Federated Learning, Mixture-of-Experts | —Unverified | 0 |
| Hypertext Entity Extraction in Webpage | Mar 4, 2024 | Mixture-of-Experts | —Unverified | 0 |