SOTAVerified

Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of someone's face together with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 1–25 of 202 papers

| Title | Status | Hype |
| --- | --- | --- |
| Sentiment Reasoning for Healthcare | Code | 3 |
| Towards Robust Multimodal Sentiment Analysis with Incomplete Data | Code | 2 |
| Multimodal Prompt Learning with Missing Modalities for Sentiment Analysis and Emotion Recognition | Code | 2 |
| Recent Trends of Multimodal Affective Computing: A Survey from NLP Perspective | Code | 2 |
| Knowledge-Guided Dynamic Modality Attention Fusion Framework for Multimodal Sentiment Analysis | Code | 2 |
| M-SENA: An Integrated Platform for Multimodal Sentiment Analysis | Code | 2 |
| MARLIN: Masked Autoencoder for facial video Representation LearnINg | Code | 2 |
| UniMSE: Towards Unified Multimodal Sentiment Analysis and Emotion Recognition | Code | 2 |
| DLF: Disentangled-Language-Focused Multimodal Sentiment Analysis | Code | 2 |
| Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis | Code | 1 |
| Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis | Code | 1 |
| Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1 |
| Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1 |
| A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1 |
| Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1 |
| Efficient Multimodal Transformer with Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | Code | 1 |
| Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis | Code | 1 |
| Deep-HOSeq: Deep Higher Order Sequence Fusion for Multimodal Sentiment Analysis | Code | 1 |
| Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1 |
| Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1 |
| Cross-Modal BERT for Text-Audio Sentiment Analysis | Code | 1 |
| CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1 |
| Analyzing Modality Robustness in Multimodal Sentiment Analysis | Code | 1 |
| Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1 |
| CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1 |
Page 1 of 9

No leaderboard results yet.