SOTAVerified

Emotion Recognition

Emotion Recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 301-325 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| Facial Emotion Recognition Using Transfer Learning in the Deep CNN | Code | 1 |
| Facial Emotion Recognition with Noisy Multi-task Annotations | Code | 1 |
| Disentangled Variational Autoencoder for Emotion Recognition in Conversations | Code | 1 |
| Frame-level emotional state alignment method for speech emotion recognition | Code | 1 |
| Emotion-Cause Pair Extraction: A New Task to Emotion Analysis in Texts | Code | 1 |
| FV2ES: A Fully End2End Multimodal System for Fast Yet Effective Video Emotion Recognition Inference | Code | 1 |
| BHAAV- A Text Corpus for Emotion Analysis from Hindi Stories | Code | 1 |
| GiMeFive: Towards Interpretable Facial Emotion Classification | Code | 1 |
| GMSS: Graph-Based Multi-Task Self-Supervised Learning for EEG Emotion Recognition | Code | 1 |
| GM-TCNet: Gated Multi-scale Temporal Convolutional Network using Emotion Causality for Speech Emotion Recognition | Code | 1 |
| iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis | Code | 1 |
| Group Gated Fusion on Attention-based Bidirectional Alignment for Multimodal Emotion Recognition | Code | 1 |
| Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1 |
| BiosERC: Integrating Biography Speakers Supported by LLMs for ERC Tasks | Code | 1 |
| AttX: Attentive Cross-Connections for Fusion of Wearable Signals in Emotion Recognition | | 0 |
| Attributes-aware Visual Emotion Representation Learning | | 0 |
| A Multimodal Emotion Recognition System: Integrating Facial Expressions, Body Movement, Speech, and Spoken Language | | 0 |
| Addressing Racial Bias in Facial Emotion Recognition | | 0 |
| CAMEO: Collection of Multilingual Emotional Speech Corpora | | 0 |
| COIN: Conversational Interactive Networks for Emotion Recognition in Conversation | | 0 |
| Attentive Cross-modal Connections for Deep Multimodal Wearable-based Emotion Recognition | | 0 |
| Attentive Convolutional Neural Network based Speech Emotion Recognition: A Study on the Impact of Input Features, Signal Length, and Acted Speech | | 0 |
| Attention Driven Fusion for Multi-Modal Emotion Recognition | | 0 |
| Attention-based Region of Interest (ROI) Detection for Speech Emotion Recognition | | 0 |
| A Multimodal Approach towards Emotion Recognition of Music using Audio and Lyrical Content | | 0 |
Page 13 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
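For reference, Top-3 Accuracy (the metric in the table above) counts a prediction as correct when the gold label appears among a model's three highest-scoring classes. A minimal pure-Python sketch; the function name and toy inputs are illustrative, not taken from the benchmark:

```python
def top3_accuracy(scores, labels):
    """scores: list of per-class score lists; labels: gold class indices.

    Returns the percentage of examples whose gold label is among the
    three classes with the highest scores.
    """
    hits = 0
    for s, y in zip(scores, labels):
        # Indices of the three highest-scoring classes.
        top3 = sorted(range(len(s)), key=lambda i: s[i], reverse=True)[:3]
        if y in top3:
            hits += 1
    return 100.0 * hits / len(labels)
```

For example, a gold label ranked third by the model still counts as a hit, which is why Top-3 numbers are always at least as high as plain accuracy.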
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
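Weighted F1, the metric in the table above, averages per-class F1 scores weighted by each class's support, which matters for emotion corpora with skewed label distributions. A pure-Python sketch of the standard definition (function name and inputs are illustrative):

```python
def weighted_f1(y_true, y_pred):
    """Support-weighted average of per-class F1, as a percentage."""
    total = len(y_true)
    score = 0.0
    for c in set(y_true):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        support = sum(1 for t in y_true if t == c)
        score += f1 * support / total  # weight each class by its frequency
    return 100.0 * score
```

This matches the behavior of scikit-learn's `f1_score(..., average="weighted")` up to its handling of classes absent from the gold labels.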
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
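The CCC metric above measures agreement between predicted and gold continuous emotion ratings (e.g., arousal or valence), penalizing both scale and location shifts, unlike plain Pearson correlation. A minimal pure-Python sketch of Lin's concordance correlation coefficient (the helper name and toy data are illustrative):

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient between two sequences.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Perfect agreement gives 1.0; a constant offset between predictions and gold values lowers CCC even when Pearson correlation stays at 1.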
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |