| Mixture of neural operator experts for learning boundary conditions and model selection | Feb 6, 2025 | Mixture-of-Experts, Model Selection | —Unverified | 0 | 0 |
| Mixture of Parrots: Experts improve memorization more than reasoning | Oct 24, 2024 | Math, Memorization | —Unverified | 0 | 0 |
| Mixture of partially linear experts | May 5, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Mixture of Quantized Experts (MoQE): Complementary Effect of Low-bit Quantization and Robustness | Oct 3, 2023 | GPU, Machine Translation | —Unverified | 0 | 0 |
| Mixture of Regression Experts in fMRI Encoding | Nov 26, 2018 | Mixture-of-Experts, Regression | —Unverified | 0 | 0 |
| Mixture of Routers | Mar 30, 2025 | Mixture-of-Experts, Parameter-Efficient Fine-Tuning | —Unverified | 0 | 0 |
| Mixture-of-Shape-Experts (MoSE): End-to-End Shape Dictionary Framework to Prompt SAM for Generalizable Medical Segmentation | Apr 13, 2025 | Dictionary Learning, Domain Generalization | —Unverified | 0 | 0 |
| Mixture of Tunable Experts - Behavior Modification of DeepSeek-R1 at Inference Time | Feb 16, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Mixtures of Deep Neural Experts for Automated Speech Scoring | Jun 23, 2021 | Automatic Speech Recognition (ASR) | —Unverified | 0 | 0 |
| MJ-VIDEO: Fine-Grained Benchmarking and Rewarding Video Preferences in Video Generation | Feb 3, 2025 | Benchmarking, Fairness | —Unverified | 0 | 0 |
| MM1.5: Methods, Analysis & Insights from Multimodal LLM Fine-tuning | Sep 30, 2024 | Mixture-of-Experts, Optical Character Recognition (OCR) | —Unverified | 0 | 0 |
| MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training | Mar 14, 2024 | In-Context Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| MMoE: Robust Spoiler Detection with Multi-modal Information and Domain-aware Mixture-of-Experts | Mar 8, 2024 | Domain Generalization, Mixture-of-Experts | —Unverified | 0 | 0 |
| μ-MoE: Test-Time Pruning as Micro-Grained Mixture-of-Experts | May 24, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| MoA: Mixture-of-Attention for Subject-Context Disentanglement in Personalized Image Generation | Apr 17, 2024 | Disentanglement, Image Generation | —Unverified | 0 | 0 |
| MobileFlow: A Multimodal LLM For Mobile GUI Agent | Jul 5, 2024 | Action Analysis, Language Modelling | —Unverified | 0 | 0 |
| Mobile V-MoEs: Scaling Down Vision Transformers via Sparse Mixture-of-Experts | Sep 8, 2023 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Mod-Adapter: Tuning-Free and Versatile Multi-concept Personalization via Modulation Adapter | May 24, 2025 | Image Generation, Mixture-of-Experts | —Unverified | 0 | 0 |
| MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts | Jan 31, 2024 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Model Agnostic Combination for Ensemble Learning | Jun 16, 2020 | Ensemble Learning, Mixture-of-Experts | —Unverified | 0 | 0 |
| Beyond the Typical: Modeling Rare Plausible Patterns in Chemical Reactions by Leveraging Sequential Mixture-of-Experts | Oct 7, 2023 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Modeling Task Relationships in Multi-variate Soft Sensor with Balanced Mixture-of-Experts | May 25, 2023 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Model Merging in Pre-training of Large Language Models | May 17, 2025 | Mixture-of-Experts | —Unverified | 0 | 0 |
| Model Selection for Gaussian-gated Gaussian Mixture of Experts Using Dendrograms of Mixing Measures | May 19, 2025 | Computational Efficiency, Ensemble Learning | —Unverified | 0 | 0 |
| Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners | Dec 15, 2022 | Mixture-of-Experts, Multi-Task Learning | —Unverified | 0 | 0 |