SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 51–75 of 2041 papers

| Title | Status | Hype |
|---|---|---|
| The Super Emotion Dataset | | 0 |
| The Pursuit of Empathy: Evaluating Small Language Models for PTSD Dialogue Support | | 0 |
| Multimodal Mixture of Low-Rank Experts for Sentiment Analysis and Emotion Recognition | | 0 |
| Mitigating Subgroup Disparities in Multi-Label Speech Emotion Recognition: A Pseudo-Labeling and Unsupervised Learning Approach | | 0 |
| FEALLM: Advancing Facial Emotion Analysis in Multimodal Large Language Models with Emotional Synergy and Reasoning | Code | 0 |
| JNLP at SemEval-2025 Task 11: Cross-Lingual Multi-Label Emotion Detection Using Generative Models | Code | 0 |
| Emotion Recognition for Low-Resource Turkish: Fine-Tuning BERTurk on TREMO and Testing on Xenophobic Political Discourse | | 0 |
| Music Interpretation and Emotion Perception: A Computational and Neurophysiological Investigation | | 0 |
| CAMEO: Collection of Multilingual Emotional Speech Corpora | | 0 |
| Interpretable Multi-Task PINN for Emotion Recognition and EDA Prediction | | 0 |
| Evaluation in EEG Emotion Recognition: State-of-the-Art Review and Unified Framework | Code | 1 |
| Emotion Knowledge Enhancement for Vision Large Language Models: A Self-Verification Approach for High-Quality Emotion Instruction Data Generation | | 0 |
| GlobalMood: A cross-cultural benchmark for music emotion recognition | | 0 |
| Robust Emotion Recognition via Bi-Level Self-Supervised Continual Learning | | 0 |
| Empirical Analysis of Asynchronous Federated Learning on Heterogeneous Devices: Efficiency, Fairness, and Privacy Trade-offs | | 0 |
| TACFN: Transformer-based Adaptive Cross-modal Fusion Network for Multimodal Emotion Recognition | Code | 0 |
| Emotion-Qwen: Training Hybrid Experts for Unified Emotion and General Vision-Language Understanding | Code | 1 |
| Artificial Behavior Intelligence: Technology, Challenges, and Future Directions | | 0 |
| VAEmo: Efficient Representation Learning for Visual-Audio Emotion with Knowledge Injection | Code | 0 |
| Emotions in the Loop: A Survey of Affective Computing for Emotional Support | | 0 |
| BERSting at the Screams: A Benchmark for Distanced, Emotional and Shouted Speech Recognition | Code | 0 |
| Spatiotemporal Emotional Synchrony in Dyadic Interactions: The Role of Speech Conditions in Facial and Vocal Affective Alignment | | 0 |
| Emotion Recognition in Contemporary Dance Performances Using Laban Movement Analysis | | 0 |
| DB-GNN: Dual-Branch Graph Neural Network with Multi-Level Contrastive Learning for Jointly Identifying Within- and Cross-Frequency Coupled Brain Networks | | 0 |
| Towards Robust Multimodal Physiological Foundation Models: Handling Arbitrary Missing Modalities | | 0 |
Page 3 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | | 1'"1 | | Unverified |
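One metric in the tables above, the concordance correlation coefficient (CCC), is common in continuous emotion prediction (e.g. arousal/valence regression): unlike Pearson correlation, it also penalizes shifts in mean and scale between predictions and labels. A minimal NumPy sketch of Lin's CCC is below; the function name and the population-variance convention are our own illustrative choices, not taken from the leaderboard.

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient.

    CCC = 2*cov(t, p) / (var(t) + var(p) + (mean(t) - mean(p))^2)

    Equals 1 only for perfect agreement; a correct ranking with a
    constant offset still scores below 1, unlike Pearson's r.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    # Population (biased) variance and covariance, per Lin's definition.
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    return 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

For example, predictions that are perfectly correlated but uniformly offset by +1 (such as `[2, 3, 4]` against labels `[1, 2, 3]`) have Pearson r = 1 but CCC = 4/7, which is why affective-computing benchmarks often prefer CCC for dimensional labels.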