
Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, for example a camera feed of a speaker's face together with their recorded speech.
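To make the task concrete, here is a minimal late-fusion sketch in Python: each modality is assumed to produce its own sentiment score in [-1, 1], and the scores are combined with a weighted average. The function name, modality names, weights, and score range are illustrative assumptions, not taken from any paper listed below.

```python
# Hypothetical late-fusion sketch: combine per-modality sentiment
# scores (assumed to lie in [-1, 1]) with a weighted average.

def fuse_sentiment(scores, weights):
    """Weighted late fusion of per-modality sentiment scores.

    scores, weights: dicts keyed by modality name (e.g. "text",
    "audio", "vision"). Weights are renormalized over the modalities
    actually present in `scores`, so a missing modality is ignored.
    """
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] / total for m in scores)

# Example: text model is positive, audio slightly negative, vision neutral.
scores = {"text": 0.8, "audio": -0.2, "vision": 0.0}
weights = {"text": 0.5, "audio": 0.3, "vision": 0.2}
print(fuse_sentiment(scores, weights))  # ~0.34
```

Many of the papers listed below replace this fixed weighting with learned fusion (e.g. attention or transformer layers over the modality representations).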

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 51–60 of 202 papers

Title | Status | Hype
Exchanging-based Multimodal Fusion with Transformer | Code | 1
MMLatch: Bottom-up Top-down Fusion for Multimodal Sentiment Analysis | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
Modulated Fusion using Transformer for Linguistic-Acoustic Emotion Recognition | Code | 1
Multilogue-Net: A Context-Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation | Code | 1
MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences | Code | 1
Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
TEASEL: A Transformer-Based Speech-Prefixed Language Model | Code | 1
Page 6 of 21

No leaderboard results yet.