| FuxiMT: Sparsifying Large Language Models for Chinese-Centric Multilingual Machine Translation | May 20, 2025 | Language Modeling, Language Modelling | —Unverified | 0 | 0 |
| FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion | Feb 5, 2024 | Missing Elements, Mixture-of-Experts | —Unverified | 0 | 0 |
| Continual Traffic Forecasting via Mixture of Experts | Jun 5, 2024 | Continual Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset | Mar 22, 2023 | Mixture-of-Experts, text-classification | —Unverified | 0 | 0 |
| Functional mixture-of-experts for classification | Feb 28, 2022 | Classification, Mixture-of-Experts | —Unverified | 0 | 0 |
| Functional-level Uncertainty Quantification for Calibrated Fine-tuning on LLMs | Oct 9, 2024 | Common Sense Reasoning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Continual Pre-training of MoEs: How robust is your router? | Mar 6, 2025 | Decoder, Mixture-of-Experts | —Unverified | 0 | 0 |
| Full-Precision Free Binary Graph Neural Networks | Sep 29, 2021 | Graph Neural Network, Mixture-of-Experts | —Unverified | 0 | 0 |
| Continual Learning Using Task Conditional Neural Networks | Sep 29, 2021 | Continual Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts | Oct 22, 2023 | Density Estimation, Mixture-of-Experts | —Unverified | 0 | 0 |
| FSMoE: A Flexible and Scalable Training System for Sparse Mixture-of-Experts Models | Jan 18, 2025 | GPU, Mixture-of-Experts | —Unverified | 0 | 0 |
| ContextWIN: Whittle Index Based Mixture-of-Experts Neural Model For Restless Bandits Via Deep RL | Oct 13, 2024 | Decision Making, Mixture-of-Experts | —Unverified | 0 | 0 |
| From Google Gemini to OpenAI Q* (Q-Star): A Survey of Reshaping the Generative Artificial Intelligence (AI) Research Landscape | Dec 18, 2023 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Fresh-CL: Feature Realignment through Experts on Hypersphere in Continual Learning | Jan 4, 2025 | Continual Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Contextual Policy Transfer in Reinforcement Learning Domains via Deep Mixtures-of-Experts | Feb 29, 2020 | Mixture-of-Experts, OpenAI Gym | —Unverified | 0 | 0 |
| A Simple Architecture for Enterprise Large Language Model Applications based on Role based security and Clearance Levels using Retrieval-Augmented Generation or Mixture of Experts | Jul 9, 2024 | Language Modeling, Language Modelling | —Unverified | 0 | 0 |
| Contextual Mixture of Experts: Integrating Knowledge into Predictive Modeling | Nov 1, 2022 | Mixture-of-Experts | —Unverified | 0 | 0 |
| FreqMoE: Dynamic Frequency Enhancement for Neural PDE Solvers | May 11, 2025 | Computational Efficiency, Mixture-of-Experts | —Unverified | 0 | 0 |
| Free Agent in Agent-Based Mixture-of-Experts Generative AI Framework | Jan 29, 2025 | Fraud Detection, Mixture-of-Experts | —Unverified | 0 | 0 |
| ConstitutionalExperts: Training a Mixture of Principle-based Prompts | Mar 7, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| A similarity-based Bayesian mixture-of-experts model | Dec 3, 2020 | Mixture-of-Experts, model | —Unverified | 0 | 0 |
| A Generalist Cross-Domain Molecular Learning Framework for Structure-Based Drug Discovery | Mar 6, 2025 | Denoising, Drug Discovery | —Unverified | 0 | 0 |
| Adapted-MoE: Mixture of Experts with Test-Time Adaption for Anomaly Detection | Sep 9, 2024 | Anomaly Detection, Mixture-of-Experts | —Unverified | 0 | 0 |
| ForceVLA: Enhancing VLA Models with a Force-aware MoE for Contact-rich Manipulation | May 28, 2025 | Contact-rich Manipulation, Mixture-of-Experts | —Unverified | 0 | 0 |
| FMT: A Multimodal Pneumonia Detection Model Based on Stacking MOE Framework | Mar 7, 2025 | Diagnostic, Medical Image Analysis | —Unverified | 0 | 0 |