| Title | Date | Tags | Code | Stars |
| --- | --- | --- | --- | --- |
| SC-MoE: Switch Conformer Mixture of Experts for Unified Streaming and Non-streaming Code-Switching ASR | Jun 26, 2024 | Automatic Speech Recognition (ASR) | Unverified | 0 |
| Mixture of Experts in a Mixture of RL settings | Jun 26, 2024 | Deep Reinforcement Learning, Mixture-of-Experts | Unverified | 0 |
| MoESD: Mixture of Experts Stable Diffusion to Mitigate Gender Bias | Jun 25, 2024 | Mixture-of-Experts | Unverified | 0 |
| Peirce in the Machine: How Mixture of Experts Models Perform Hypothesis Construction | Jun 24, 2024 | Mixture-of-Experts | Code Available | 0 |
| Theory on Mixture-of-Experts in Continual Learning | Jun 24, 2024 | Continual Learning, Mixture-of-Experts | Unverified | 0 |
| LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training | Jun 24, 2024 | Mixture-of-Experts | Code Available | 5 |
| OTCE: Hybrid SSM and Attention with Cross Domain Mixture of Experts to construct Observer-Thinker-Conceiver-Expresser | Jun 24, 2024 | Language Modeling | Code Available | 0 |
| SimSMoE: Solving Representational Collapse via Similarity Measure | Jun 22, 2024 | Mixture-of-Experts | Unverified | 0 |
| Low-Rank Mixture-of-Experts for Continual Medical Image Segmentation | Jun 19, 2024 | Continual Learning, Image Segmentation | Unverified | 0 |
| AdaMoE: Token-Adaptive Routing with Null Experts for Mixture-of-Experts Language Models | Jun 19, 2024 | ARC, Mixture-of-Experts | Code Available | 1 |