
Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. a camera feed of a speaker's face together with their recorded speech.

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)
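To make the task definition concrete, here is a minimal, illustrative sketch of late fusion, one common baseline in which each modality is scored separately and the scores are then combined. The scores, weights, and the `late_fusion` helper are hypothetical placeholders, not the method of any paper listed below; real systems would obtain per-modality scores from trained text, audio, and vision models.

```python
# Minimal late-fusion sketch for multimodal sentiment analysis.
# Each modality contributes a sentiment score in [-1, 1]; the fused
# score is their weighted average.

def late_fusion(scores, weights):
    """Combine per-modality sentiment scores via a weighted average."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

# Hypothetical per-modality outputs and fusion weights:
scores = {"text": 0.6, "audio": 0.2, "vision": -0.1}
weights = {"text": 0.5, "audio": 0.3, "vision": 0.2}
fused = late_fusion(scores, weights)  # mildly positive overall sentiment
```

More sophisticated approaches (early fusion, cross-modal attention, contrastive alignment) appear throughout the paper list below; this sketch only shows the basic idea of aggregating evidence across modalities.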

Papers

Showing 101–125 of 202 papers

Papers with released code are marked [Code]; all papers on this page currently have a Hype score of 0.

- Multimodal Sentiment Analysis with Missing Modality: A Knowledge-Transfer Approach
- Explainable Multimodal Sentiment Analysis on Bengali Memes
- PowMix: A Versatile Regularizer for Multimodal Sentiment Analysis
- Multimodal Sentiment Analysis: Perceived vs Induced Sentiments
- Improving Multimodal Sentiment Analysis: Supervised Angular Margin-based Contrastive Learning for Enhanced Fusion Representation
- Unsupervised Graph Attention Autoencoder for Attributed Networks using K-means Loss
- Multi-label Emotion Analysis in Conversation via Multimodal Knowledge Distillation
- Robust Multimodal Learning with Missing Modalities via Parameter-Efficient Adaptation
- Exploiting Diverse Feature for Multimodal Sentiment Analysis
- General Debiasing for Multimodal Sentiment Analysis [Code]
- ConKI: Contrastive Knowledge Injection for Multimodal Sentiment Analysis
- Modality Influence in Multimodal Machine Learning
- Towards Arabic Multimodal Dataset for Sentiment Analysis [Code]
- Syntax-aware Hybrid prompt model for Few-shot multi-modal sentiment analysis
- Denoising Bottleneck with Mutual Information Maximization for Video Multimodal Fusion [Code]
- Cross-Attention is Not Enough: Incongruity-Aware Dynamic Hierarchical Fusion for Multimodal Affect Recognition [Code]
- Speech-Text Dialog Pre-training for Spoken Dialog Understanding with Explicit Cross-Modal Alignment
- Shared and Private Information Learning in Multimodal Sentiment Analysis with Deep Modal Alignment and Self-supervised Multi-Task Learning
- Multimodal Sentiment Analysis: A Survey
- Interpretable multimodal sentiment analysis based on textual modality descriptions by using large-scale language models [Code]
- TextMI: Textualize Multimodal Information for Integrating Non-verbal Cues in Pre-trained Language Models
- Exploring Multimodal Sentiment Analysis via CBAM Attention and Double-layer BiLSTM Architecture
- Curriculum Learning Meets Weakly Supervised Modality Correlation Learning
- A Self-Adjusting Fusion Representation Learning Model for Unaligned Text-Audio Sequences
- On the Use of Modality-Specific Large-Scale Pre-Trained Encoders for Multimodal Sentiment Analysis [Code]
Page 5 of 9

No leaderboard results yet.