| A Novel A.I Enhanced Reservoir Characterization with a Combined Mixture of Experts -- NVIDIA Modulus based Physics Informed Neural Operator Forward Model | Apr 20, 2024 | Mixture-of-Experts, Uncertainty Quantification | Unverified | 0 |
| A Large-scale Medical Visual Task Adaptation Benchmark | Apr 19, 2024 | Mixture-of-Experts | Unverified | 0 |
| MoA: Mixture-of-Attention for Subject-Context Disentanglement in Personalized Image Generation | Apr 17, 2024 | Disentanglement, Image Generation | Unverified | 0 |
| Med-MoE: Mixture of Domain-Specific Experts for Lightweight Medical Vision-Language Models | Apr 16, 2024 | Image Classification | Code Available | 2 |
| Generative AI Agents with Large Language Model for Satellite Networks via a Mixture of Experts Transmission | Apr 14, 2024 | Language Modelling | Unverified | 0 |
| Intuition-aware Mixture-of-Rank-1-Experts for Parameter Efficient Finetuning | Apr 13, 2024 | Diversity, Mixture-of-Experts | Unverified | 0 |
| Countering Mainstream Bias via End-to-End Adaptive Local Learning | Apr 13, 2024 | Collaborative Filtering, Mixture-of-Experts | Code Available | 0 |
| Mixture of Experts Soften the Curse of Dimensionality in Operator Learning | Apr 13, 2024 | Mixture-of-Experts, Operator Learning | Unverified | 0 |
| MoE-FFD: Mixture of Experts for Generalized and Parameter-Efficient Face Forgery Detection | Apr 12, 2024 | Mixture-of-Experts | Code Available | 2 |
| JetMoE: Reaching Llama2 Performance with 0.1M Dollars | Apr 11, 2024 | GPU, Mixture-of-Experts | Code Available | 4 |