
Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of a speaker's face together with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)
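To make the definition above concrete, here is a minimal sketch of decision-level ("late") fusion, one common way to combine per-modality sentiment predictions. The modality names, scores, and weights are hypothetical stand-ins for real models (e.g. a text classifier and an audio prosody model), not the method of any particular paper listed below; each score is assumed to lie in [-1, 1].

```python
def fuse_late(scores, weights):
    """Weighted average of per-modality sentiment scores (hypothetical fusion rule)."""
    assert len(scores) == len(weights) and sum(weights) > 0
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Hypothetical example: text is mildly positive, audio prosody strongly
# positive, facial expression neutral.
modality_scores = {"text": 0.4, "audio": 0.8, "vision": 0.0}
modality_weights = {"text": 0.5, "audio": 0.3, "vision": 0.2}

fused = fuse_late(list(modality_scores.values()),
                  list(modality_weights.values()))
# Map the fused score to a discrete label with an arbitrary dead zone.
label = "positive" if fused > 0.05 else ("negative" if fused < -0.05 else "neutral")
```

Late fusion is only one design point; many of the papers below instead fuse at the feature level (early fusion) or learn cross-modal interactions directly (e.g. attention-based fusion).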

Papers

Showing 25 of 202 papers

Title | Status | Hype
Multimodal Multi-loss Fusion Network for Sentiment Analysis | Code | 1
MuSe-Toolbox: The Multimodal Sentiment Analysis Continuous Annotation Fusion and Discrete Class Transformation Toolbox | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1
EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding | Code | 1
Analyzing Modality Robustness in Multimodal Sentiment Analysis | Code | 1
Make Acoustic and Visual Cues Matter: CH-SIMS v2.0 Dataset and AV-Mixup Consistent Module | Code | 1
Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis | Code | 1
Cross-Modal BERT for Text-Audio Sentiment Analysis | Code | 1
CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1
CubeMLP: An MLP-based Model for Multimodal Sentiment Analysis and Depression Estimation | Code | 1
Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis | Code | 1
Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1
Deep-HOSeq: Deep Higher Order Sequence Fusion for Multimodal Sentiment Analysis | Code | 1
Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis | Code | 1
Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
MMLatch: Bottom-up Top-down Fusion for Multimodal Sentiment Analysis | Code | 1
CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1
MSAF: Multimodal Split Attention Fusion | Code | 1
Counterfactual Reasoning for Out-of-distribution Multimodal Sentiment Analysis | Code | 1
Efficient Multimodal Transformer with Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | Code | 1
Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1
Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1
Page 2 of 9

No leaderboard results yet.