SOTAVerified

Emotion Recognition

Emotion recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
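As a toy illustration of the speech modality mentioned above, the sketch below classifies a synthetic "utterance" using two simple prosodic features (RMS energy and zero-crossing rate) and a nearest-centroid rule. All signals, labels, and feature choices here are made up for the example; real speech emotion recognition systems (like those benchmarked below) use far richer features and learned models.

```python
import math

def features(signal):
    """Two toy prosodic features: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (rms, zcr)

def synth(freq, amp, n=1600, sr=16000):
    """Synthetic sine-wave stand-in for a speech recording (hypothetical data)."""
    return [amp * math.sin(2 * math.pi * freq * t / sr) for t in range(n)]

# Hypothetical labelled centroids: high-arousal "happy" (louder, higher pitch)
# vs. low-arousal "sad" (quieter, lower pitch).
train = {
    "happy": features(synth(300, 0.8)),
    "sad": features(synth(120, 0.2)),
}

def classify(signal):
    """Assign the label whose feature centroid is closest in Euclidean distance."""
    f = features(signal)
    return min(train, key=lambda lbl: sum((a - b) ** 2 for a, b in zip(f, train[lbl])))

print(classify(synth(280, 0.7)))  # -> happy (closer to the "happy" centroid)
```

The nearest-centroid rule is the simplest possible classifier; it stands in here for the LSTM, transformer, and graph-network models listed in the papers below.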

Papers

Showing 151–175 of 2041 papers

Title | Status | Hype
--- | --- | ---
Adaptive Progressive Attention Graph Neural Network for EEG Emotion Recognition | — | 0
OSUM: Advancing Open Speech Understanding Models with Limited Resources in Academia | Code | 3
EmoTech: A Multi-modal Speech Emotion Recognition Using Multi-source Low-level Information with Hybrid Recurrent Network | — | 0
EmoFormer: A Text-Independent Speech Emotion Recognition using a Hybrid Transformer-CNN model | — | 0
Why disentanglement-based speaker anonymization systems fail at preserving emotions? | — | 0
Representation Learning with Parameterised Quantum Circuits for Advancing Speech Emotion Recognition | — | 0
LLM supervised Pre-training for Multimodal Emotion Recognition in Conversations | — | 0
Uncertainty Estimation in the Real World: A Study on Music Emotion Recognition | — | 0
Synthetic Data Generation by Supervised Neural Gas Network for Physiological Emotion Recognition Data | Code | 1
AIMA at SemEval-2024 Task 10: History-Based Emotion Recognition in Hindi-English Code-Mixed Conversations | — | 0
Omni-Emotion: Extending Video MLLM with Detailed Face and Audio Modeling for Multimodal Emotion Analysis | — | 0
Deep Learning-Based Feature Fusion for Emotion Analysis and Suicide Risk Differentiation in Chinese Psychological Support Hotlines | Code | 0
XMusic: Towards a Generalized and Controllable Symbolic Music Generation Framework | — | 0
Empathetic Conversational Agents: Utilizing Neural and Physiological Signals for Enhanced Empathetic Interactions | — | 0
EmoNeXt: an Adapted ConvNeXt for Facial Emotion Recognition | Code | 1
CG-MER: A Card Game-based Multimodal dataset for Emotion Recognition | — | 0
A Heterogeneous Multimodal Graph Learning Framework for Recognizing User Emotions in Social Networks | — | 0
Multi-face emotion detection for effective Human-Robot Interaction | — | 0
EmotiCrafter: Text-to-Emotional-Image Generation based on Valence-Arousal Model | — | 0
JELLY: Joint Emotion Recognition and Context Reasoning with LLMs for Conversational Speech Synthesis | — | 0
Leveraging Cross-Attention Transformer and Multi-Feature Fusion for Cross-Linguistic Speech Emotion Recognition | — | 0
MVP: Multimodal Emotion Recognition based on Video and Physiological Signals | — | 0
Fitting Different Interactive Information: Joint Classification of Emotion and Intention | — | 0
Learning discriminative features from spectrograms using center loss for speech emotion recognition | — | 0
TED: Turn Emphasis with Dialogue Feature Attention for Emotion Recognition in Conversation | — | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | M2D-CLAP | EmoA | 77.4 | — | Unverified
2 | M2D2 | EmoA | 76.7 | — | Unverified
3 | M2D | EmoA | 76.1 | — | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | — | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | — | Unverified
2 | MultiMAE-DER | WAR | 83.61 | — | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | — | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | — | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | CAGE | Top-3 Accuracy (%) | 14.73 | — | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | VGG based | 5-class test accuracy | 66.13 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | BiHDM | Accuracy | 40.34 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | 4D-aNN | Accuracy | 96.1 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
--- | --- | --- | --- | --- | ---
1 | CNN | 1'" | 1 | — | Unverified