
Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of someone's face together with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)
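To make the task description concrete, here is a minimal late-fusion sketch: each modality (face, speech) is reduced to a feature vector, scored independently, and the per-modality sentiment scores are averaged. All names, feature sizes, and weights below are hypothetical placeholders, not from any paper listed on this page.

```python
import numpy as np

def modality_score(features: np.ndarray, weights: np.ndarray) -> float:
    """Logistic sentiment score in [0, 1] for one modality's features."""
    return float(1.0 / (1.0 + np.exp(-features @ weights)))

def late_fuse(scores: list[float]) -> float:
    """Late fusion: average the per-modality sentiment scores."""
    return float(np.mean(scores))

# Hypothetical features, e.g. from a face-crop encoder and a speech encoder.
rng = np.random.default_rng(0)
face_features = rng.normal(size=8)
speech_features = rng.normal(size=8)
face_weights = rng.normal(size=8)
speech_weights = rng.normal(size=8)

scores = [
    modality_score(face_features, face_weights),
    modality_score(speech_features, speech_weights),
]
sentiment = late_fuse(scores)  # in [0, 1]; above 0.5 read as positive
```

Real systems in the list below replace the random features with learned encoders and the averaging step with attention, graph, or cross-modal fusion modules, but the overall shape (per-modality representation, then fusion) is the same.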

Papers

Showing 41–50 of 202 papers

Title | Status | Hype
Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1
Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
Counterfactual Reasoning for Out-of-distribution Multimodal Sentiment Analysis | Code | 1
CH-SIMS: A Chinese Multimodal Sentiment Analysis Dataset with Fine-grained Annotation of Modality | Code | 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1
Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1
Multilogue-Net: A Context-Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation | Code | 1
Page 5 of 21

No leaderboard results yet.