| Title | Date | Tasks | Code | Stars |
| --- | --- | --- | --- | --- |
| SUTRA: Scalable Multilingual Language Model Architecture | May 7, 2024 | Computational Efficiency, Hallucination | Unverified | 0 |
| Lory: Fully Differentiable Mixture-of-Experts for Autoregressive Language Model Pre-training | May 6, 2024 | Language Modeling | Unverified | 0 |
| MEET: Mixture of Experts Extra Tree-Based sEMG Hand Gesture Identification | May 6, 2024 | Electromyography (EMG), Gesture Recognition | Unverified | 0 |
| WDMoE: Wireless Distributed Large Language Models with Mixture of Experts | May 6, 2024 | Mixture-of-Experts | Unverified | 0 |
| Mixture of partially linear experts | May 5, 2024 | Mixture-of-Experts | Unverified | 0 |
| Hierarchical mixture of discriminative Generalized Dirichlet classifiers | May 2, 2024 | Mixture-of-Experts, Spam Detection | Unverified | 0 |
| Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment | May 1, 2024 | Mixture-of-Experts | Unverified | 0 |
| Powering In-Database Dynamic Model Slicing for Structured Data Analytics | May 1, 2024 | Mixture-of-Experts | Unverified | 0 |
| MoPEFT: A Mixture-of-PEFTs for the Segment Anything Model | May 1, 2024 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | Unverified | 0 |
| Lancet: Accelerating Mixture-of-Experts Training via Whole Graph Computation-Communication Overlapping | Apr 30, 2024 | Mixture-of-Experts | Unverified | 0 |
| Mix of Experts Language Model for Named Entity Recognition | Apr 30, 2024 | Language Modeling | Unverified | 0 |
| Towards Incremental Learning in Large Language Models: A Critical Review | Apr 28, 2024 | Continual Learning, Incremental Learning | Unverified | 0 |
| Integration of Mixture of Experts and Multimodal Generative AI in Internet of Vehicles: A Survey | Apr 25, 2024 | Autonomous Driving, Decision Making | Unverified | 0 |
| U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | Apr 25, 2024 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| A Novel A.I Enhanced Reservoir Characterization with a Combined Mixture of Experts -- NVIDIA Modulus based Physics Informed Neural Operator Forward Model | Apr 20, 2024 | Mixture-of-Experts, Uncertainty Quantification | Unverified | 0 |
| A Large-scale Medical Visual Task Adaptation Benchmark | Apr 19, 2024 | Mixture-of-Experts | Unverified | 0 |
| MoA: Mixture-of-Attention for Subject-Context Disentanglement in Personalized Image Generation | Apr 17, 2024 | Disentanglement, Image Generation | Unverified | 0 |
| Generative AI Agents with Large Language Model for Satellite Networks via a Mixture of Experts Transmission | Apr 14, 2024 | Language Modeling | Unverified | 0 |
| Intuition-aware Mixture-of-Rank-1-Experts for Parameter Efficient Finetuning | Apr 13, 2024 | Diversity, Mixture-of-Experts | Unverified | 0 |
| Mixture of Experts Soften the Curse of Dimensionality in Operator Learning | Apr 13, 2024 | Mixture-of-Experts, Operator Learning | Unverified | 0 |
| Countering Mainstream Bias via End-to-End Adaptive Local Learning | Apr 13, 2024 | Collaborative Filtering, Mixture-of-Experts | Code Available | 0 |
| Identifying Shopping Intent in Product QA for Proactive Recommendations | Apr 9, 2024 | Friction, Mixture-of-Experts | Unverified | 0 |
| Dense Training, Sparse Inference: Rethinking Training of Mixture-of-Experts Language Models | Apr 8, 2024 | GPU, Mixture-of-Experts | Unverified | 0 |
| SEER-MoE: Sparse Expert Efficiency through Regularization for Mixture-of-Experts | Apr 7, 2024 | Mixture-of-Experts | Unverified | 0 |
| Shortcut-connected Expert Parallelism for Accelerating Mixture-of-Experts | Apr 7, 2024 | Mixture-of-Experts | Unverified | 0 |