| Title | Date | Topics | Code |
|---|---|---|---|
| Generalizing Multimodal Variational Methods to Sets | Dec 19, 2022 | Mixture-of-Experts | Unverified |
| Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners | Dec 15, 2022 | Mixture-of-Experts, Multi-Task Learning | Unverified |
| Fixing MoE Over-Fitting on Low-Resource Languages in Multilingual Machine Translation | Dec 15, 2022 | Machine Translation, Mixture-of-Experts | Unverified |
| SMILE: Scaling Mixture-of-Experts with Efficient Bi-level Routing | Dec 10, 2022 | Mixture-of-Experts | Unverified |
| Incorporating Polar Field Data for Improved Solar Flare Prediction | Dec 4, 2022 | Mixture-of-Experts, Prediction | Unverified |
| Named Entity and Relation Extraction with Multi-Modal Retrieval | Dec 3, 2022 | Mixture-of-Experts, Multi-modal Named Entity Recognition | Unverified |
| Automatically Extracting Information in Medical Dialogue: Expert System And Attention for Labelling | Nov 28, 2022 | Mixture-of-Experts | Unverified |
| Double Deep Q-Learning in Opponent Modeling | Nov 24, 2022 | Mixture-of-Experts, Q-Learning | Unverified |
| Who Says Elephants Can't Run: Bringing Large Scale MoE Models into Cloud Scale Production | Nov 18, 2022 | Machine Translation, Mixture-of-Experts | Unverified |
| A Bird's-eye View of Reranking: from List Level to Page Level | Nov 17, 2022 | Mixture-of-Experts, Recommendation Systems | Code Available |