SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
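The core task described above is mapping a feature vector extracted from one of these modalities to a discrete emotion class. As a minimal sketch (not taken from any listed paper; the feature dimensionality, emotion set, and untrained weights are all illustrative assumptions), a softmax classifier over such features looks like this:

```python
import numpy as np

# Hypothetical emotion label set -- real benchmarks use varying class sets.
EMOTIONS = ["anger", "happiness", "sadness", "neutral"]

rng = np.random.default_rng(0)
n_features = 8  # assumed dimensionality of a landmark/speech feature vector
W = rng.normal(size=(n_features, len(EMOTIONS)))  # untrained toy weights
b = np.zeros(len(EMOTIONS))

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def predict(features):
    """Return (predicted label, class probabilities) for one feature vector."""
    probs = softmax(features @ W + b)
    return EMOTIONS[int(probs.argmax())], probs

label, probs = predict(rng.normal(size=n_features))
print(label, float(probs.sum()))
```

In practice the feature extractor (a CNN over face images, or a pretrained speech encoder such as wav2vec 2.0) and the classifier weights are learned jointly; the sketch only isolates the final classification step.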

Papers

Showing 926–950 of 2041 papers

Title | Status | Hype
Facial Emotions Recognition using Convolutional Neural Net | – | 0
Dynamic Graph Neural ODE Network for Multi-modal Emotion Recognition in Conversation | – | 0
Facial Expression Recognition and Image Description Generation in Vietnamese | – | 0
Adaptive Fusion Techniques for Multimodal Data | – | 0
Facial expression recognition based on local region specific features and support vector machines | – | 0
Facial Geometric Feature Extraction for Dimensional Emotion Analysis Using Genetic Programming | – | 0
FAF: A novel multimodal emotion recognition approach integrating face, body and text | – | 0
Best Practices for Noise-Based Augmentation to Improve the Performance of Deployable Speech-Based Emotion Recognition Systems | – | 0
Fast Facial Landmark Detection and Applications: A Survey | – | 0
A Survey on Physiological Signal Based Emotion Recognition | – | 0
Dynamic Facial Expression Generation on Hilbert Hypersphere with Conditional Wasserstein Generative Adversarial Nets | – | 0
Dynamic Causal Disentanglement Model for Dialogue Emotion Detection | – | 0
ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection | – | 0
Cross-modal Context Fusion and Adaptive Graph Convolutional Network for Multimodal Conversational Emotion Recognition | – | 0
BERT-ERC: Fine-tuning BERT is Enough for Emotion Recognition in Conversation | – | 0
Feature-level and Model-level Audiovisual Fusion for Emotion Recognition in the Wild | – | 0
Feature Selection Approaches for Optimising Music Emotion Recognition Methods | – | 0
Feature Selection Enhancement and Feature Space Visualization for Speech-Based Emotion Recognition | – | 0
An Architecture for Accelerated Large-Scale Inference of Transformer-Based Language Models | – | 0
Cross-Task Inconsistency Based Active Learning (CTIAL) for Emotion Recognition | – | 0
A Survey on Speech Large Language Models | – | 0
Fusing ASR Outputs in Joint Training for Speech Emotion Recognition | – | 0
Feelings from the Past---Adapting Affective Lexicons for Historical Emotion Analysis | – | 0
Technical Approach for the EMI Challenge in the 8th Affective Behavior Analysis in-the-Wild Competition | – | 0
An Approach for Improving Automatic Mouth Emotion Recognition | – | 0
Page 38 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | – | Unverified
2 | M2D2 | EmoA | 76.7 | – | Unverified
3 | M2D | EmoA | 76.1 | – | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | – | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | – | Unverified
2 | MultiMAE-DER | WAR | 83.61 | – | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | – | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | – | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | – | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'" | 1 | – | Unverified