
Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis from multiple data sources at once - e.g. a camera feed of a speaker's face together with their recorded speech.
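One common baseline for combining modalities is late fusion: each modality-specific model produces its own sentiment score, and the scores are merged with a weighted average. The sketch below is illustrative only; the modality names, scores, and weights are assumptions, not taken from any paper listed here.

```python
# Minimal late-fusion sketch for multimodal sentiment analysis.
# Assumes each modality-specific model has already produced a
# sentiment score in [-1, 1]; names and weights are illustrative.

def fuse_sentiment(scores, weights):
    """Weighted late fusion of per-modality sentiment scores.

    scores  -- dict mapping modality name to a score in [-1, 1]
    weights -- dict mapping modality name to a non-negative weight
    """
    total = sum(weights[m] for m in scores)
    if total == 0:
        raise ValueError("weights must not all be zero")
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: the text model is confidently positive, audio mildly
# negative, vision neutral; text is weighted highest.
fused = fuse_sentiment(
    scores={"text": 0.8, "audio": -0.2, "vision": 0.0},
    weights={"text": 0.5, "audio": 0.3, "vision": 0.2},
)
```

Many of the papers below instead learn the fusion jointly (e.g. with transformers or attention graphs), which lets the model exploit cross-modal interactions that a fixed weighted average cannot capture.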

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 26–50 of 202 papers

Title | Status | Hype
MuSe 2020 -- The First International Multimodal Sentiment Analysis in Real-life Media Challenge and Workshop | Code | 1
Modulated Fusion using Transformer for Linguistic-Acoustic Emotion Recognition | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1
MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis | Code | 1
MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences | Code | 1
CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1
Analyzing Modality Robustness in Multimodal Sentiment Analysis | Code | 1
Multilogue-Net: A Context-Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation | Code | 1
Exchanging-based Multimodal Fusion with Transformer | Code | 1
CubeMLP: An MLP-based Model for Multimodal Sentiment Analysis and Depression Estimation | Code | 1
Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1
Cross-Modal BERT for Text-Audio Sentiment Analysis | Code | 1
Deep-HOSeq: Deep Higher Order Sequence Fusion for Multimodal Sentiment Analysis | Code | 1
Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis | Code | 1
Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1
Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
Counterfactual Reasoning for Out-of-distribution Multimodal Sentiment Analysis | Code | 1
CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1
Jointly Fine-Tuning "BERT-like" Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1
Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1
Page 2 of 9

Leaderboard

No leaderboard results yet.