SOTAVerified

Mixture-of-Experts

Papers

Showing 701–750 of 1312 papers

| Title | Status | Hype |
| --- | --- | --- |
| Mixture of Experts Meets Prompt-Based Continual Learning | Code | 1 |
| Graph Sparsification via Mixture of Graphs | Code | 1 |
| Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models | Code | 2 |
| Unchosen Experts Can Contribute Too: Unleashing MoE Models' Power by Self-Contrast | Code | 1 |
| Sigmoid Gating is More Sample Efficient than Softmax Gating in Mixture of Experts | | 0 |
| xRAG: Extreme Context Compression for Retrieval-augmented Generation with One Token | Code | 2 |
| DirectMultiStep: Direct Route Generation for Multi-Step Retrosynthesis | Code | 1 |
| Ensemble and Mixture-of-Experts DeepONets For Operator Learning | Code | 0 |
| MeteoRA: Multiple-tasks Embedded LoRA for Large Language Models | Code | 1 |
| Learning More Generalized Experts by Merging Experts in Mixture-of-Experts | | 0 |
| Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts | Code | 5 |
| Many Hands Make Light Work: Task-Oriented Dialogue System with Module-Based Mixture-of-Experts | | 0 |
| M^4oE: A Foundation Model for Medical Multimodal Image Segmentation with Mixture of Experts | Code | 1 |
| A Mixture of Experts Approach to 3D Human Motion Prediction | Code | 0 |
| A Mixture-of-Experts Approach to Few-Shot Task Transfer in Open-Ended Text Worlds | | 0 |
| EWMoE: An effective model for global weather forecasting with mixture-of-experts | Code | 1 |
| CuMo: Scaling Multimodal LLM with Co-Upcycled Mixture-of-Experts | Code | 2 |
| SUTRA: Scalable Multilingual Language Model Architecture | | 0 |
| DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model | Code | 9 |
| MEET: Mixture of Experts Extra Tree-Based sEMG Hand Gesture Identification | | 0 |
| WDMoE: Wireless Distributed Large Language Models with Mixture of Experts | | 0 |
| Lory: Fully Differentiable Mixture-of-Experts for Autoregressive Language Model Pre-training | | 0 |
| Mixture of partially linear experts | | 0 |
| MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts | Code | 3 |
| Hierarchical mixture of discriminative Generalized Dirichlet classifiers | | 0 |
| Powering In-Database Dynamic Model Slicing for Structured Data Analytics | | 0 |
| Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment | | 0 |
| MoPEFT: A Mixture-of-PEFTs for the Segment Anything Model | | 0 |
| Lancet: Accelerating Mixture-of-Experts Training via Whole Graph Computation-Communication Overlapping | | 0 |
| Revisiting RGBT Tracking Benchmarks from the Perspective of Modality Validity: A New Benchmark, Problem, and Method | Code | 1 |
| Mix of Experts Language Model for Named Entity Recognition | | 0 |
| M3oE: Multi-Domain Multi-Task Mixture-of-Experts Recommendation Framework | Code | 1 |
| Swin2-MoSE: A New Single Image Super-Resolution Model for Remote Sensing | Code | 1 |
| Towards Incremental Learning in Large Language Models: A Critical Review | | 0 |
| Large Multi-modality Model Assisted AI-Generated Image Quality Assessment | Code | 1 |
| Integration of Mixture of Experts and Multimodal Generative AI in Internet of Vehicles: A Survey | | 0 |
| U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | | 0 |
| Multi-Head Mixture-of-Experts | Code | 1 |
| XFT: Unlocking the Power of Code Instruction Tuning by Simply Merging Upcycled Mixture-of-Experts | Code | 1 |
| MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA-based Mixture of Experts | Code | 3 |
| A Novel A.I Enhanced Reservoir Characterization with a Combined Mixture of Experts -- NVIDIA Modulus based Physics Informed Neural Operator Forward Model | | 0 |
| A Large-scale Medical Visual Task Adaptation Benchmark | | 0 |
| MoA: Mixture-of-Attention for Subject-Context Disentanglement in Personalized Image Generation | | 0 |
| Med-MoE: Mixture of Domain-Specific Experts for Lightweight Medical Vision-Language Models | Code | 2 |
| Generative AI Agents with Large Language Model for Satellite Networks via a Mixture of Experts Transmission | | 0 |
| Intuition-aware Mixture-of-Rank-1-Experts for Parameter Efficient Finetuning | | 0 |
| Countering Mainstream Bias via End-to-End Adaptive Local Learning | Code | 0 |
| Mixture of Experts Soften the Curse of Dimensionality in Operator Learning | | 0 |
| MoE-FFD: Mixture of Experts for Generalized and Parameter-Efficient Face Forgery Detection | Code | 2 |
| JetMoE: Reaching Llama2 Performance with 0.1M Dollars | Code | 4 |
Page 15 of 27