SOTAVerified

Multimodal Sentiment Analysis

Multimodal sentiment analysis is the task of performing sentiment analysis using multiple data sources, e.g. video of a speaker's face together with their recorded speech.
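As a minimal sketch of the idea, the snippet below shows feature-level (early) fusion of two modalities: a text embedding and an audio embedding are concatenated and passed through a linear sentiment head. All embeddings and weights here are random stand-ins for illustration; real systems would use trained encoders (e.g. a sentence encoder for text, prosodic features for audio).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality embeddings for one utterance
# (stand-ins for the outputs of trained encoders).
text_emb = rng.normal(size=64)   # e.g. from a sentence encoder
audio_emb = rng.normal(size=32)  # e.g. from prosodic/acoustic features

# Early fusion: concatenate modality embeddings into one vector.
fused = np.concatenate([text_emb, audio_emb])  # shape (96,)

# Linear sentiment head with random stand-in weights,
# 3 classes: negative / neutral / positive.
W = rng.normal(size=(3, fused.size))
logits = W @ fused

# Softmax over the 3 sentiment classes.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.shape)  # (3,)
```

Many of the papers listed below instead use late fusion (combining per-modality predictions) or attention-based cross-modal fusion; this concatenation scheme is only the simplest baseline.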

(Image credit: ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection)

Papers

Showing 51–100 of 202 papers

Title | Status | Hype
MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1
The MuSe 2022 Multimodal Sentiment Analysis Challenge: Humor, Emotional Reactions, and Stress | Code | 1
Exploiting BERT For Multimodal Target Sentiment Classification Through Input Space Translation | Code | 1
MuSe 2020 -- The First International Multimodal Sentiment Analysis in Real-life Media Challenge and Workshop | Code | 1
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network | Code | 1
Transfer Learning with Joint Fine-Tuning for Multimodal Sentiment Analysis | Code | 1
Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1
EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding | Code | 1
Multi-Modality Collaborative Learning for Sentiment Analysis | Code | 1
Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1
Cross-Modal BERT for Text-Audio Sentiment Analysis | Code | 1
Generalizable Multi-linear Attention Network | - | 0
Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities | - | 0
Gated Mechanism for Attention Based Multimodal Sentiment Analysis | - | 0
CorMulT: A Semi-supervised Modality Correlation-aware Multimodal Transformer for Sentiment Analysis | - | 0
Findings of the Shared Task on Multimodal Sentiment Analysis and Troll Meme Classification in Dravidian Languages | - | 0
Exploring Multimodal Sentiment Analysis via CBAM Attention and Double-layer BiLSTM Architecture | - | 0
Exploring Large Language Models for Multimodal Sentiment Analysis: Challenges, Benchmarks, and Future Directions | - | 0
A Text-Centered Shared-Private Framework via Cross-Modal Prediction for Multimodal Sentiment Analysis | - | 0
A Multimodal Sentiment Dataset for Video Recommendation | - | 0
Exploiting Diverse Feature for Multimodal Sentiment Analysis | - | 0
Contextual Augmented Global Contrast for Multimodal Intent Recognition | - | 0
Explainable Multimodal Sentiment Analysis on Bengali Memes | - | 0
A Self-Adjusting Fusion Representation Learning Model for Unaligned Text-Audio Sequences | - | 0
A Novel Context-Aware Multimodal Framework for Persian Sentiment Analysis | - | 0
Enriching Multimodal Sentiment Analysis through Textual Emotional Descriptions of Visual-Audio Content | - | 0
Modeling Intra- and Inter-Modal Relations: Hierarchical Graph Contrastive Learning for Multimodal Sentiment Analysis | - | 0
ConKI: Contrastive Knowledge Injection for Multimodal Sentiment Analysis | - | 0
Modality Influence in Multimodal Machine Learning | - | 0
Missing Modality meets Meta Sampling (M3S): An Efficient Universal Approach for Multimodal Sentiment Analysis with Missing Modality | - | 0
Emoji Driven Crypto Assets Market Reactions | - | 0
CMSBERT-CLR: Context-driven Modality Shifting BERT with Contrastive Learning for linguistic, visual, acoustic Representations | - | 0
AMOA: Global Acoustic Feature Enhanced Modal-Order-Aware Network for Multimodal Sentiment Analysis | - | 0
Multimodal Language Analysis in the Wild: CMU-MOSEI Dataset and Interpretable Dynamic Fusion Graph | - | 0
Dynamic Multimodal Sentiment Analysis: Leveraging Cross-Modal Attention for Enabled Classification | - | 0
Adversarial Multimodal Domain Transfer for Video-Level Sentiment Analysis | - | 0
DravidianMultiModality: A Dataset for Multi-modal Sentiment Analysis in Tamil and Malayalam | - | 0
MART: Masked Affective RepresenTation Learning via Masked Temporal Distribution Distillation | - | 0
An Empirical Study on Configuring In-Context Learning Demonstrations for Unleashing MLLMs' Sentimental Perception Capability | - | 0
DNN Multimodal Fusion Techniques for Predicting Video Sentiment | - | 0
Multimodal Contrastive Learning via Uni-Modal Coding and Cross-Modal Prediction for Multimodal Sentiment Analysis | - | 0
Meta-Learn Unimodal Signals with Weak Supervision for Multimodal Sentiment Analysis | - | 0
M2Lens: Visualizing and Explaining Multimodal Models for Sentiment Analysis | - | 0
Lightweight Models for Multimodal Sequential Data | - | 0
An AutoML-based Approach to Multimodal Image Sentiment Analysis | - | 0
Multi-channel Attentive Graph Convolutional Network With Sentiment Fusion For Multimodal Sentiment Analysis | - | 0
Modality-Invariant Bidirectional Temporal Representation Distillation Network for Missing Multimodal Sentiment Analysis | - | 0
Learning Robust Joint Representations for Multimodal Sentiment Analysis | - | 0
Learning Relationships between Text, Audio, and Video via Deep Canonical Correlation for Multimodal Language Analysis | - | 0
Page 2 of 5

No leaderboard results yet.