
Mixture-of-Experts

Papers

Showing 726-750 of 1312 papers

Title | Status | Hype
Powering In-Database Dynamic Model Slicing for Structured Data Analytics | - | 0
Mixture of insighTful Experts (MoTE): The Synergy of Thought Chains and Expert Mixtures in Self-Alignment | - | 0
MoPEFT: A Mixture-of-PEFTs for the Segment Anything Model | - | 0
Lancet: Accelerating Mixture-of-Experts Training via Whole Graph Computation-Communication Overlapping | - | 0
Revisiting RGBT Tracking Benchmarks from the Perspective of Modality Validity: A New Benchmark, Problem, and Method | Code | 1
Mix of Experts Language Model for Named Entity Recognition | - | 0
M3oE: Multi-Domain Multi-Task Mixture-of-Experts Recommendation Framework | Code | 1
Swin2-MoSE: A New Single Image Super-Resolution Model for Remote Sensing | Code | 1
Towards Incremental Learning in Large Language Models: A Critical Review | - | 0
Large Multi-modality Model Assisted AI-Generated Image Quality Assessment | Code | 1
Integration of Mixture of Experts and Multimodal Generative AI in Internet of Vehicles: A Survey | - | 0
U2++ MoE: Scaling 4.7x parameters with minimal impact on RTF | - | 0
Multi-Head Mixture-of-Experts | Code | 1
XFT: Unlocking the Power of Code Instruction Tuning by Simply Merging Upcycled Mixture-of-Experts | Code | 1
MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA-based Mixture of Experts | Code | 3
A Novel A.I Enhanced Reservoir Characterization with a Combined Mixture of Experts -- NVIDIA Modulus based Physics Informed Neural Operator Forward Model | - | 0
A Large-scale Medical Visual Task Adaptation Benchmark | - | 0
MoA: Mixture-of-Attention for Subject-Context Disentanglement in Personalized Image Generation | - | 0
Med-MoE: Mixture of Domain-Specific Experts for Lightweight Medical Vision-Language Models | Code | 2
Generative AI Agents with Large Language Model for Satellite Networks via a Mixture of Experts Transmission | - | 0
Intuition-aware Mixture-of-Rank-1-Experts for Parameter Efficient Finetuning | - | 0
Countering Mainstream Bias via End-to-End Adaptive Local Learning | Code | 0
Mixture of Experts Soften the Curse of Dimensionality in Operator Learning | - | 0
MoE-FFD: Mixture of Experts for Generalized and Parameter-Efficient Face Forgery Detection | Code | 2
JetMoE: Reaching Llama2 Performance with 0.1M Dollars | Code | 4
Page 30 of 53

No leaderboard results yet.