SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 101–125 of 2041 papers

Title | Status | Hype
Deep Learning for Speech Emotion Recognition: A CNN Approach Utilizing Mel Spectrograms | | 0
Coverage-Guaranteed Speech Emotion Recognition via Calibrated Uncertainty-Adaptive Prediction Sets | | 0
Modeling speech emotion with label variance and analyzing performance across speakers and unseen acoustic conditions | | 0
FACE: Few-shot Adapter with Cross-view Fusion for Cross-subject EEG Emotion Recognition | | 0
Enhancing Multi-Label Emotion Analysis and Corresponding Intensities for Ethiopian Languages | | 0
Feature-Based Dual Visual Feature Extraction Model for Compound Multimodal Emotion Recognition | Code | 0
Unifying EEG and Speech for Emotion Recognition: A Two-Step Joint Learning Framework for Handling Missing EEG Data During Inference | | 0
Modelling Emotions in Face-to-Face Setting: The Interplay of Eye-Tracking, Personality, and Temporal Dynamics | | 0
MAVEN: Multi-modal Attention for Valence-Arousal Emotion Network | Code | 1
United we stand, Divided we fall: Handling Weak Complementary Relationships for Audio-Visual Emotion Recognition in Valence-Arousal Space | | 0
Prosody-Enhanced Acoustic Pre-training and Acoustic-Disentangled Prosody Adapting for Movie Dubbing | Code | 1
Compound Expression Recognition via Large Vision-Language Models | | 0
Mamba-VA: A Mamba-based Approach for Continuous Emotion Recognition in Valence-Arousal Space | Code | 0
Technical Approach for the EMI Challenge in the 8th Affective Behavior Analysis in-the-Wild Competition | | 0
Emotion Recognition with CLIP and Sequential Learning | | 0
CULEMO: Cultural Lenses on Emotion -- Benchmarking LLMs for Cross-Cultural Emotion Understanding | | 0
CALLM: Understanding Cancer Survivors' Emotions and Intervention Opportunities via Mobile Diaries and Context-Aware Language Models | | 0
Synthetic Data Generation of Body Motion Data by Neural Gas Network for Emotion Recognition | Code | 0
Multimodal Emotion Recognition and Sentiment Analysis in Multi-Party Conversation Contexts | | 0
Heterogeneous bimodal attention fusion for speech emotion recognition | | 0
Bimodal Connection Attention Fusion for Speech Emotion Recognition | | 0
R1-Omni: Explainable Omni-Multimodal Emotion Recognition with Reinforcement Learning | Code | 5
Personalized Emotion Detection from Floor Vibrations Induced by Footsteps | | 0
Qieemo: Speech Is All You Need in the Emotion Recognition in Conversations | | 0
ECG-EmotionNet: Nested Mixture of Expert (NMoE) Adaptation of ECG-Foundation Model for Driver Emotion Recognition | | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | | Unverified
2 | M2D2 | EmoA | 76.7 | | Unverified
3 | M2D | EmoA | 76.1 | | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified
2 | MultiMAE-DER | WAR | 83.61 | | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified
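The table above mixes plain Accuracy with WAR (weighted average recall), a common metric in speech emotion recognition. As a minimal sketch, assuming the usual definitions (this page does not define them): WAR weights each class's recall by its frequency, which reduces to overall accuracy, while its macro counterpart UAR averages per-class recalls equally and so penalizes models that ignore rare emotions.

```python
def war_uar(y_true, y_pred):
    """Compute WAR and UAR for a classification run.

    WAR: per-class recall weighted by class frequency (equals overall accuracy).
    UAR: unweighted (macro-averaged) recall, robust to class imbalance.
    """
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        recalls.append(sum(y_pred[i] == c for i in idx) / len(idx))
    war = sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)
    uar = sum(recalls) / len(recalls)
    return war, uar
```

On an imbalanced label set the two diverge: a model that is perfect on the majority class but weak on a minority class keeps a high WAR while its UAR drops.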
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified
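The MaSaC-ERC-Z entry reports weighted F1, the usual choice for imbalanced emotion-label distributions in conversation datasets. A minimal sketch, assuming the standard support-weighted definition (the function name is mine, not from this page):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1 scores averaged with weights equal to each
    class's support (count) in y_true."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        # F1 = 2*TP / (2*TP + FP + FN); define F1 = 0 for an empty class
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        score += (support[c] / total) * f1
    return score
```

Because the weights follow the true-label distribution, frequent emotions dominate the score, unlike macro F1, which treats all classes equally.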
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified
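The w2v2-L-robust-12 row reports CCC, the standard agreement metric for continuous valence/arousal prediction (it ranges from -1 to 1, with 1 meaning perfect agreement). A minimal sketch following Lin's definition, which combines correlation with mean and scale agreement:

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient (Lin, 1989):
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population variances (ddof=0)
    cov = ((x - mx) * (y - my)).mean()  # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Unlike Pearson correlation, CCC drops when predictions are systematically biased or mis-scaled even if they track the target perfectly, which is why it is preferred for dimensional emotion benchmarks.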
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'" | 1 | | Unverified