SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). (Source: Using Deep Autoencoders for Facial Expression Recognition)
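
To make the speech-signal route above concrete, the sketch below shows one common recipe: turn each clip into log-mel spectrogram features and fit a simple classifier on top. It is a minimal, illustrative example only; the file paths, label set, and hyperparameters are assumptions and are not taken from any paper listed on this page.

```python
# Minimal speech-emotion-recognition sketch (illustrative assumptions throughout).
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

def logmel_features(path, sr=16000, n_mels=64):
    """Load a clip and summarize its log-mel spectrogram as a fixed-size vector."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    logmel = librosa.power_to_db(mel, ref=np.max)  # shape: (n_mels, frames)
    # Mean and std over time give a crude utterance-level embedding.
    return np.concatenate([logmel.mean(axis=1), logmel.std(axis=1)])

def train(paths, labels):
    """Fit a simple linear classifier on utterance-level features."""
    X = np.stack([logmel_features(p) for p in paths])
    return LogisticRegression(max_iter=1000).fit(X, labels)

# Hypothetical usage with your own labeled clips:
# clf = train(["clip1.wav", "clip2.wav"], ["happy", "sad"])
# print(clf.predict([logmel_features("test.wav")]))
```

In practice the utterance-level averaging would usually be replaced by a sequence model or a CNN over the full spectrogram, as several of the papers listed below do.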

Papers

Showing 401–425 of 2041 papers

Title | Status | Hype
GatedxLSTM: A Multimodal Affective Computing Approach for Emotion Recognition in Conversations | | 0
Large Language Models Meet Contrastive Learning: Zero-Shot Emotion Recognition Across Languages | Code | 0
Hierarchical Adaptive Expert for Multimodal Sentiment Analysis | | 0
Deep Learning for Speech Emotion Recognition: A CNN Approach Utilizing Mel Spectrograms | | 0
Modeling speech emotion with label variance and analyzing performance across speakers and unseen acoustic conditions | | 0
Enhancing Multi-Label Emotion Analysis and Corresponding Intensities for Ethiopian Languages | | 0
Coverage-Guaranteed Speech Emotion Recognition via Calibrated Uncertainty-Adaptive Prediction Sets | | 0
FACE: Few-shot Adapter with Cross-view Fusion for Cross-subject EEG Emotion Recognition | | 0
Feature-Based Dual Visual Feature Extraction Model for Compound Multimodal Emotion Recognition | Code | 0
Unifying EEG and Speech for Emotion Recognition: A Two-Step Joint Learning Framework for Handling Missing EEG Data During Inference | | 0
Modelling Emotions in Face-to-Face Setting: The Interplay of Eye-Tracking, Personality, and Temporal Dynamics | | 0
United we stand, Divided we fall: Handling Weak Complementary Relationships for Audio-Visual Emotion Recognition in Valence-Arousal Space | | 0
Compound Expression Recognition via Large Vision-Language Models | | 0
Mamba-VA: A Mamba-based Approach for Continuous Emotion Recognition in Valence-Arousal Space | Code | 0
Emotion Recognition with CLIP and Sequential Learning | | 0
Technical Approach for the EMI Challenge in the 8th Affective Behavior Analysis in-the-Wild Competition | | 0
CULEMO: Cultural Lenses on Emotion -- Benchmarking LLMs for Cross-Cultural Emotion Understanding | | 0
CALLM: Understanding Cancer Survivors' Emotions and Intervention Opportunities via Mobile Diaries and Context-Aware Language Models | | 0
Synthetic Data Generation of Body Motion Data by Neural Gas Network for Emotion Recognition | Code | 0
Heterogeneous bimodal attention fusion for speech emotion recognition | | 0
Multimodal Emotion Recognition and Sentiment Analysis in Multi-Party Conversation Contexts | | 0
Bimodal Connection Attention Fusion for Speech Emotion Recognition | | 0
Personalized Emotion Detection from Floor Vibrations Induced by Footsteps | | 0
Qieemo: Speech Is All You Need in the Emotion Recognition in Conversations | | 0
ECG-EmotionNet: Nested Mixture of Expert (NMoE) Adaptation of ECG-Foundation Model for Driver Emotion Recognition | | 0
Page 17 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | | Unverified
2 | M2D2 | EmoA | 76.7 | | Unverified
3 | M2D | EmoA | 76.1 | | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified
2 | MultiMAE-DER | WAR | 83.61 | | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'" | 1 | | Unverified