| Title | Date | Tags | Status | Code Links |
| --- | --- | --- | --- | --- |
| Mixture of Experts for Node Classification | Nov 30, 2024 | Classification, Mixture-of-Experts | Unverified | 0 |
| MQFL-FHE: Multimodal Quantum Federated Learning Framework with Fully Homomorphic Encryption | Nov 30, 2024 | Federated Learning, Mixture-of-Experts | Unverified | 0 |
| HiMoE: Heterogeneity-Informed Mixture-of-Experts for Fair Spatial-Temporal Forecasting | Nov 30, 2024 | Fairness, Mixture-of-Experts | Unverified | 0 |
| LaVIDE: A Language-Vision Discriminator for Detecting Changes in Satellite Image with Map References | Nov 29, 2024 | Change Detection, Mixture-of-Experts | Unverified | 0 |
| On the effectiveness of discrete representations in sparse mixture of experts | Nov 28, 2024 | Mixture-of-Experts, Quantization | Unverified | 0 |
| Mixture of Cache-Conditional Experts for Efficient Mobile Device Inference | Nov 27, 2024 | GSM8K, Language Modeling | Unverified | 0 |
| Mixture of Experts in Image Classification: What's the Sweet Spot? | Nov 27, 2024 | Image Classification | Unverified | 0 |
| Complexity Experts are Task-Discriminative Learners for Any Image Restoration | Nov 27, 2024 | Attribute, Blind All-in-One Image Restoration | Unverified | 0 |
| UOE: Unlearning One Expert Is Enough For Mixture-of-Experts LLMs | Nov 27, 2024 | Large Language Model, Mixture-of-Experts | Unverified | 0 |
| Condense, Don't Just Prune: Enhancing Efficiency and Performance in MoE Layer Pruning | Nov 26, 2024 | Mixture-of-Experts | Code Available | 1 |