SOTAVerified

Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of someone's face together with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)
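As an illustration of the task, per-modality sentiment scores (say, one from a facial-expression model and one from a speech model) can be combined by simple late fusion. This is a minimal sketch with made-up scores and weights, not the method of any paper listed below; `fuse_sentiment` and its inputs are hypothetical.

```python
# Late-fusion sketch: combine per-modality sentiment scores into one
# prediction. Scores and weights are illustrative only.

def fuse_sentiment(scores, weights=None):
    """Weighted average of per-modality sentiment scores in [-1, 1].

    scores:  dict mapping modality name -> sentiment score
    weights: optional dict mapping modality name -> weight
    """
    if weights is None:
        weights = {m: 1.0 for m in scores}
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Visual modality slightly positive, audio strongly positive;
# trust audio twice as much as vision.
fused = fuse_sentiment({"visual": 0.2, "audio": 0.8},
                       weights={"visual": 1.0, "audio": 2.0})
print(round(fused, 3))  # weighted mean of the two scores
```

Real systems on this page fuse learned representations (e.g. with transformers or graph attention) rather than scalar scores, but the weighting idea is the same.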

Papers

Showing 51–60 of 202 papers

Title — Status — Hype

Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion — Code — 1
MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences — Code — 1
Deep-HOSeq: Deep Higher Order Sequence Fusion for Multimodal Sentiment Analysis — Code — 1
Cross-Modal BERT for Text-Audio Sentiment Analysis — Code — 1
Modulated Fusion using Transformer for Linguistic-Acoustic Emotion Recognition — Code — 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition — Code — 1
Multilogue-Net: A Context-Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation — Code — 1
CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality — Code — 1
A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis — Code — 1
MISA: Modality-Invariant and -Specific Representations for Multimodal Sentiment Analysis — Code — 1
Page 6 of 21

No leaderboard results yet.