SOTAVerified

Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of someone's face combined with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)
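As an illustration of the task, the sketch below shows a minimal late-fusion baseline: each modality (text, audio, vision) produces its own sentiment score, and the scores are combined with a weighted average. All scores and weights here are hypothetical examples for illustration; they are not taken from any of the papers listed on this page.

```python
# Minimal late-fusion sketch for multimodal sentiment analysis.
# Scores are assumed to lie in [-1, 1]; weights are hypothetical.

def fuse_sentiment(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-modality sentiment scores with a weighted average."""
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: positive wording, mildly positive tone, neutral facial expression.
scores = {"text": 0.6, "audio": 0.4, "vision": 0.0}
weights = {"text": 0.5, "audio": 0.3, "vision": 0.2}
print(round(fuse_sentiment(scores, weights), 3))  # prints 0.42
```

Late fusion is only one option; many of the papers below instead learn joint representations (early or intermediate fusion) so that cross-modal interactions are modeled directly.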

Papers

Showing 1–50 of 202 papers

Title | Status | Hype
Sentiment Reasoning for Healthcare | Code | 3
DLF: Disentangled-Language-Focused Multimodal Sentiment Analysis | Code | 2
Knowledge-Guided Dynamic Modality Attention Fusion Framework for Multimodal Sentiment Analysis | Code | 2
Towards Robust Multimodal Sentiment Analysis with Incomplete Data | Code | 2
Recent Trends of Multimodal Affective Computing: A Survey from NLP Perspective | Code | 2
Multimodal Prompt Learning with Missing Modalities for Sentiment Analysis and Emotion Recognition | Code | 2
UniMSE: Towards Unified Multimodal Sentiment Analysis and Emotion Recognition | Code | 2
MARLIN: Masked Autoencoder for facial video Representation LearnINg | Code | 2
M-SENA: An Integrated Platform for Multimodal Sentiment Analysis | Code | 2
MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1
Multi-Modality Collaborative Learning for Sentiment Analysis | Code | 1
Semi-IIN: Semi-supervised Intra-inter modal Interaction Learning Network for Multimodal Sentiment Analysis | Code | 1
EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding | Code | 1
Bridging the Gap for Test-Time Multimodal Sentiment Analysis | Code | 1
Enhancing Multimodal Sentiment Analysis for Missing Modality through Self-Distillation and Unified Modality Cross-Attention | Code | 1
Semantic-Guided Multimodal Sentiment Decoding with Adversarial Temporal-Invariant Learning | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
New Benchmark Dataset and Fine-Grained Cross-Modal Fusion Framework for Vietnamese Multimodal Aspect-Category Sentiment Analysis | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
Learning Language-guided Adaptive Hyper-modality Representation for Multimodal Sentiment Analysis | Code | 1
Exchanging-based Multimodal Fusion with Transformer | Code | 1
UniSA: Unified Generative Framework for Sentiment Analysis | Code | 1
Multimodal Multi-loss Fusion Network for Sentiment Analysis | Code | 1
The MuSe 2023 Multimodal Sentiment Analysis Challenge: Mimicked Emotions, Cross-Cultural Humour, and Personalisation | Code | 1
MELTR: Meta Loss Transformer for Learning to Fine-tune Video Foundation Models | Code | 1
Few-shot Multimodal Sentiment Analysis based on Multimodal Probabilistic Fusion Prompts | Code | 1
Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1
Transfer Learning with Joint Fine-Tuning for Multimodal Sentiment Analysis | Code | 1
Towards Exploiting Sticker for Multimodal Sentiment Analysis in Social Media: A New Dataset and Baseline | Code | 1
TVLT: Textless Vision-Language Transformer | Code | 1
Make Acoustic and Visual Cues Matter: CH-SIMS v2.0 Dataset and AV-Mixup Consistent Module | Code | 1
Efficient Multimodal Transformer with Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | Code | 1
CubeMLP: An MLP-based Model for Multimodal Sentiment Analysis and Depression Estimation | Code | 1
Counterfactual Reasoning for Out-of-distribution Multimodal Sentiment Analysis | Code | 1
The MuSe 2022 Multimodal Sentiment Analysis Challenge: Humor, Emotional Reactions, and Stress | Code | 1
Analyzing Modality Robustness in Multimodal Sentiment Analysis | Code | 1
Tag-assisted Multimodal Sentiment Analysis under Uncertain Missing Modalities | Code | 1
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
MMLatch: Bottom-up Top-down Fusion for Multimodal Sentiment Analysis | Code | 1
Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma Distributions | Code | 1
TEASEL: A Transformer-Based Speech-Prefixed Language Model | Code | 1
Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1
Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis | Code | 1
MuSe-Toolbox: The Multimodal Sentiment Analysis Continuous Annotation Fusion and Discrete Class Transformation Toolbox | Code | 1
The MuSe 2021 Multimodal Sentiment Analysis Challenge: Sentiment, Emotion, Physiological-Emotion, and Stress | Code | 1
Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis | Code | 1
MSAF: Multimodal Split Attention Fusion | Code | 1
SWAFN: Sentimental Words Aware Fusion Network for Multimodal Sentiment Analysis | Code | 1
Page 1 of 5

No leaderboard results yet.