SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 1601–1625 of 2041 papers

Title | Status | Hype
Adaptive Fusion Techniques for Multimodal Data | | 0
M3ER: Multiplicative Multimodal Emotion Recognition Using Facial, Textual, and Speech Cues | | 0
Addressing Ambiguity of Emotion Labels Through Meta-Learning | | 0
Speaker-invariant Affective Representation Learning via Adversarial Training | | 0
An Affective Situation Labeling System from Psychological Behaviors in Emotion Recognition | | 0
Context-aware Interactive Attention for Multi-modal Sentiment and Emotion Analysis | | 0
Modeling Feature Representations for Affective Speech using Generative Adversarial Networks | | 0
Privacy Enhanced Multimodal Neural Representations for Emotion Recognition | | 0
DENS: A Dataset for Multi-class Emotion Analysis | | 0
AI in Pursuit of Happiness, Finding Only Sadness: Multi-Modal Facial Emotion Recognition Challenge | | 0
Domain adversarial learning for emotion recognition | | 0
Unsupervised Representation Learning with Future Observation Prediction for Speech Emotion Recognition | | 0
Multi-label Co-regularization for Semi-supervised Facial Action Unit Recognition | Code | 0
Conversational Emotion Analysis via Attention Mechanisms | | 0
Emotion recognition with 4kresolution database | | 0
Expression Analysis Based on Face Regions in Read-world Conditions | | 0
Speech Emotion Recognition via Contrastive Loss under Siamese Networks | | 0
Spatiotemporal Emotion Recognition using Deep CNN Based on EEG during Music Listening | Code | 0
Speech Emotion Recognition with Dual-Sequence LSTM Architecture | | 0
PT-CoDE: Pre-trained Context-Dependent Encoder for Utterance-level Emotion Recognition | Code | 1
Facial Emotion Recognition Using Deep Learning | | 0
Indian EmoSpeech Command Dataset: A dataset for emotion based speech recognition in the wild | Code | 0
Face Behavior a la carte: Expressions, Affect and Action Units in a Single Network | | 0
Self-supervised Learning for ECG-based Emotion Recognition | Code | 1
Interpretable Deep Neural Networks for Facial Expression and Dimensional Emotion Recognition in-the-wild | | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | | Unverified
2 | M2D2 | EmoA | 76.7 | | Unverified
3 | M2D | EmoA | 76.1 | | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified
2 | MultiMAE-DER | WAR | 83.61 | | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | | Unverified
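One metric in the benchmark tables, the concordance correlation coefficient (CCC), scores agreement between predicted and reference continuous emotion ratings (e.g. arousal or valence), combining correlation with penalties for mean and scale differences. A minimal sketch of how it is commonly computed, assuming Lin's definition with population (biased) variance and covariance; the `ccc` helper below is illustrative and not taken from any model listed above:

```python
from statistics import fmean

def ccc(x, y):
    """Concordance correlation coefficient (Lin, 1989),
    using population variance and covariance."""
    mx, my = fmean(x), fmean(y)
    vx = fmean([(a - mx) ** 2 for a in x])
    vy = fmean([(b - my) ** 2 for b in y])
    cov = fmean([(a - mx) * (b - my) for a, b in zip(x, y)])
    # Penalize both decorrelation (via cov) and bias (mean shift).
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Perfect agreement gives CCC = 1.0
print(ccc([0.1, 0.4, 0.7], [0.1, 0.4, 0.7]))
```

Unlike Pearson correlation, CCC drops below 1.0 when predictions are correlated with the labels but systematically shifted or rescaled, which is why it is favored for dimensional emotion benchmarks.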