Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, or modalities, e.g., a camera feed of a speaker's face together with their recorded speech.
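A common baseline for combining modalities is late fusion: each modality gets its own classifier, and their scores are merged at prediction time. A minimal sketch, assuming hypothetical per-modality scores in [-1, 1] (the function and names below are illustrative, not taken from any paper listed on this page):

```python
# Late-fusion sketch: each modality classifier emits a sentiment score
# in [-1, 1]; the fused prediction is a weight-normalized average.
# All names here are hypothetical, for illustration only.

def late_fusion(scores, weights):
    """Combine per-modality sentiment scores with a weighted average.

    scores  -- dict mapping modality name -> score in [-1, 1]
    weights -- dict mapping modality name -> non-negative weight
    """
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: text is confidently positive, audio mildly negative,
# vision neutral; the text modality is weighted most heavily.
fused = late_fusion(
    {"text": 0.8, "audio": -0.2, "vision": 0.0},
    {"text": 0.5, "audio": 0.3, "vision": 0.2},
)
label = "positive" if fused > 0 else "negative" if fused < 0 else "neutral"
```

Many of the papers below go beyond this baseline with learned fusion (attention, cross-modal transformers), but late fusion is the usual point of comparison.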

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 1–50 of 202 papers

Title | Status | Hype
Sentiment Reasoning for Healthcare | Code | 3
Towards Robust Multimodal Sentiment Analysis with Incomplete Data | Code | 2
DLF: Disentangled-Language-Focused Multimodal Sentiment Analysis | Code | 2
Multimodal Prompt Learning with Missing Modalities for Sentiment Analysis and Emotion Recognition | Code | 2
Knowledge-Guided Dynamic Modality Attention Fusion Framework for Multimodal Sentiment Analysis | Code | 2
UniMSE: Towards Unified Multimodal Sentiment Analysis and Emotion Recognition | Code | 2
Recent Trends of Multimodal Affective Computing: A Survey from NLP Perspective | Code | 2
M-SENA: An Integrated Platform for Multimodal Sentiment Analysis | Code | 2
MARLIN: Masked Autoencoder for facial video Representation LearnINg | Code | 2
Semi-IIN: Semi-supervised Intra-inter modal Interaction Learning Network for Multimodal Sentiment Analysis | Code | 1
New Benchmark Dataset and Fine-Grained Cross-Modal Fusion Framework for Vietnamese Multimodal Aspect-Category Sentiment Analysis | Code | 1
Make Acoustic and Visual Cues Matter: CH-SIMS v2.0 Dataset and AV-Mixup Consistent Module | Code | 1
Semantic-Guided Multimodal Sentiment Decoding with Adversarial Temporal-Invariant Learning | Code | 1
MMLatch: Bottom-up Top-down Fusion for Multimodal Sentiment Analysis | Code | 1
MELTR: Meta Loss Transformer for Learning to Fine-tune Video Foundation Models | Code | 1
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
Multi-Modality Collaborative Learning for Sentiment Analysis | Code | 1
Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion | Code | 1
Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis | Code | 1
Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1
Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis | Code | 1
Multimodal Multi-loss Fusion Network for Sentiment Analysis | Code | 1
Efficient Multimodal Transformer with Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | Code | 1
EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding | Code | 1
MSAF: Multimodal Split Attention Fusion | Code | 1
MuSe 2020 -- The First International Multimodal Sentiment Analysis in Real-life Media Challenge and Workshop | Code | 1
Modulated Fusion using Transformer for Linguistic-Acoustic Emotion Recognition | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1
MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis | Code | 1
MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences | Code | 1
CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1
Analyzing Modality Robustness in Multimodal Sentiment Analysis | Code | 1
Multilogue-Net: A Context Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation | Code | 1
Exchanging-based Multimodal Fusion with Transformer | Code | 1
CubeMLP: An MLP-based Model for Multimodal Sentiment Analysis and Depression Estimation | Code | 1
Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1
Cross-Modal BERT for Text-Audio Sentiment Analysis | Code | 1
Deep-HOSeq: Deep Higher Order Sequence Fusion for Multimodal Sentiment Analysis | Code | 1
Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis | Code | 1
Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1
Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
Counterfactual Reasoning for Out-of-distribution Multimodal Sentiment Analysis | Code | 1
CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1
Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1

No leaderboard results yet.