SOTAVerified

Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis with multiple data sources, e.g. a camera feed of someone's face together with their recorded speech.
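One common baseline for combining modalities is late fusion: score each modality independently, then merge the scores. The sketch below is a minimal illustration of that idea, not any specific paper's method; the modality names, scores, and weights are hypothetical placeholders (a real system would produce the per-modality scores with, e.g., a text model on the transcript and an audio/vision model on the recording).

```python
def fuse_sentiment(scores, weights=None):
    """Late-fusion sketch: combine per-modality sentiment scores in [-1, 1].

    scores:  dict mapping modality name -> sentiment score
    weights: optional dict mapping modality name -> non-negative weight
             (defaults to equal weighting across the given modalities)
    """
    if weights is None:
        weights = {m: 1.0 for m in scores}
    total = sum(weights[m] for m in scores)
    # Weighted average of the per-modality scores.
    return sum(scores[m] * weights[m] for m in scores) / total


# Hypothetical example: the transcript reads positive, but the acoustic
# channel (flat, low-energy voice) pulls the fused score down.
fused = fuse_sentiment({"text": 0.8, "audio": -0.2})
```

Many of the papers listed below replace this simple weighted average with learned fusion, e.g. transformer-based joint encoding or outer-product fusion over time.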

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 161–170 of 202 papers

| Title | Status | Hype |
|---|---|---|
| Modulated Fusion using Transformer for Linguistic-Acoustic Emotion Recognition | Code | 1 |
| Multimodal Sentiment Analysis with Multi-perspective Fusion Network Focusing on Sense Attentive Language | | 0 |
| TransModality: An End2End Fusion Method with Transformer for Multimodal Sentiment Analysis | | 0 |
| Jointly Fine-Tuning "BERT-like" Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1 |
| TPFN: Applying Outer Product along Time to Multimodal Sentiment Analysis Fusion on Incomplete Data | | 0 |
| Multilogue-Net: A Context-Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation | Code | 1 |
| CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1 |
| A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1 |
| MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis | Code | 1 |
| MuSe 2020 -- The First International Multimodal Sentiment Analysis in Real-life Media Challenge and Workshop | Code | 1 |
Page 17 of 21

No leaderboard results yet.