SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 451–475 of 2041 papers

Title | Status | Hype
Deep Learning-Based Feature Fusion for Emotion Analysis and Suicide Risk Differentiation in Chinese Psychological Support Hotlines | Code | 0
XMusic: Towards a Generalized and Controllable Symbolic Music Generation Framework | — | 0
CG-MER: A Card Game-based Multimodal dataset for Emotion Recognition | — | 0
Empathetic Conversational Agents: Utilizing Neural and Physiological Signals for Enhanced Empathetic Interactions | — | 0
A Heterogeneous Multimodal Graph Learning Framework for Recognizing User Emotions in Social Networks | — | 0
Multi-face emotion detection for effective Human-Robot Interaction | — | 0
EmotiCrafter: Text-to-Emotional-Image Generation based on Valence-Arousal Model | — | 0
JELLY: Joint Emotion Recognition and Context Reasoning with LLMs for Conversational Speech Synthesis | — | 0
Leveraging Cross-Attention Transformer and Multi-Feature Fusion for Cross-Linguistic Speech Emotion Recognition | — | 0
MVP: Multimodal Emotion Recognition based on Video and Physiological Signals | — | 0
Fitting Different Interactive Information: Joint Classification of Emotion and Intention | — | 0
TED: Turn Emphasis with Dialogue Feature Attention for Emotion Recognition in Conversation | — | 0
Is It Still Fair? Investigating Gender Fairness in Cross-Corpus Speech Emotion Recognition | — | 0
Learning Discriminative Features from Spectrograms Using Center Loss for Speech Emotion Recognition | — | 0
Unsupervised Discovery of Facial Landmarks and Head Pose | — | 0
CocoER: Aligning Multi-Level Feature by Competition and Coordination for Emotion Recognition | — | 0
EMOE: Modality-Specific Enhanced Dynamic Emotion Experts | — | 0
Metadata-Enhanced Speech Emotion Recognition: Augmented Residual Integration and Co-Attention in Two-Stage Fine-Tuning | — | 0
Sample Correlation for Fingerprinting Deep Face Recognition | Code | 0
Enhancing Multimodal Emotion Recognition through Multi-Granularity Cross-Modal Alignment | — | 0
Mouth Articulation-Based Anchoring for Improved Cross-Corpus Speech Emotion Recognition | — | 0
A Multimodal Emotion Recognition System: Integrating Facial Expressions, Body Movement, Speech, and Spoken Language | — | 0
Effective Context Modeling Framework for Emotion Recognition in Conversations | — | 0
Spatio-Temporal Fuzzy-oriented Multi-Modal Meta-Learning for Fine-grained Emotion Recognition | Code | 0
Bridge then Begin Anew: Generating Target-relevant Intermediate Model for Source-free Visual Emotion Adaptation | Code | 0
Page 19 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | — | Unverified
2 | M2D2 | EmoA | 76.7 | — | Unverified
3 | M2D | EmoA | 76.1 | — | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | — | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | — | Unverified
2 | MultiMAE-DER | WAR | 83.61 | — | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | — | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | — | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | — | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | — | Unverified
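Top-3 accuracy, the metric reported for CAGE and FocusCLIP above, counts a prediction as correct when the true class appears among a model's three highest-scoring classes. A minimal NumPy sketch (the function name `top_k_accuracy` and the score-matrix layout are illustrative, not from the benchmark's own evaluation code):

```python
import numpy as np

def top_k_accuracy(scores, labels, k=3):
    """Fraction of samples whose true label is among the k top-scored classes.

    scores: (n_samples, n_classes) array of per-class scores.
    labels: (n_samples,) array of true class indices.
    """
    scores = np.asarray(scores)
    # Indices of the k highest-scoring classes per sample.
    topk = np.argsort(scores, axis=1)[:, -k:]
    hits = (topk == np.asarray(labels)[:, None]).any(axis=1)
    return hits.mean()
```

With `k=1` this reduces to ordinary classification accuracy.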
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | — | Unverified
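The weighted F1-score reported above is the per-class F1 averaged with each class weighted by its support (its number of true samples), which keeps majority classes from being drowned out on imbalanced emotion datasets. A minimal pure-Python sketch of the standard definition (the function name `weighted_f1` is illustrative):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Support-weighted average of per-class F1 scores."""
    support = Counter(y_true)
    n = len(y_true)
    total = 0.0
    for c in support:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += support[c] / n * f1
    return total
```

This matches scikit-learn's `f1_score(y_true, y_pred, average="weighted")` when every predicted class also appears in `y_true`.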
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | — | Unverified
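The concordance correlation coefficient (CCC) used above is the standard metric for continuous (e.g. valence/arousal) emotion regression: it measures how well predictions agree with gold annotations, penalizing both scale and location shifts, unlike plain Pearson correlation. A minimal NumPy sketch of Lin's CCC (the function name `ccc` is illustrative):

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two sequences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population variances
    cov = ((x - mx) * (y - my)).mean()   # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

CCC is 1 for perfect agreement, 0 for no agreement, and negative for reversed predictions, so the 0.64 claimed for w2v2-L-robust-12 indicates moderately strong agreement.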
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | — | Unverified