SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 76–100 of 2041 papers

Title | Status | Hype
Deep Multilayer Perceptrons for Dimensional Speech Emotion Recognition | Code | 1
Contextual Information and Commonsense Based Prompt for Emotion Recognition in Conversation | Code | 1
CoMPM: Context Modeling with Speaker's Pre-trained Memory Tracking for Emotion Recognition in Conversation | Code | 1
Continuous Emotion Recognition using Visual-audio-linguistic information: A Technical Report for ABAW3 | Code | 1
Codified audio language modeling learns useful representations for music information retrieval | Code | 1
A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations | Code | 1
COGMEN: COntextualized GNN based Multimodal Emotion recognitioN | Code | 1
Affective Behaviour Analysis Using Pretrained Model with Facial Priori | Code | 1
CoMPM: Context Modeling with Speaker's Pre-trained Memory Tracking for Emotion Recognition in Conversation | Code | 1
Context Based Emotion Recognition using EMOTIC Dataset | Code | 1
Context De-confounded Emotion Recognition | Code | 1
A Japanese Dataset for Subjective and Objective Sentiment Polarity Classification in Micro Blog Domain | Code | 1
How to Enhance Causal Discrimination of Utterances: A Case on Affective Reasoning | Code | 1
A Efficient Multimodal Framework for Large Scale Emotion Recognition by Fusing Music and Electrodermal Activity Signals | Code | 1
A Joint Cross-Attention Model for Audio-Visual Fusion in Dimensional Emotion Recognition | Code | 1
Cross Attentional Audio-Visual Fusion for Dimensional Emotion Recognition | Code | 1
Cross-Lingual Cross-Age Group Adaptation for Low-Resource Elderly Speech Emotion Recognition | Code | 1
Compact Graph Architecture for Speech Emotion Recognition | Code | 1
Curriculum Learning Meets Directed Acyclic Graph for Multimodal Emotion Recognition | Code | 1
Contrast and Generation Make BART a Good Dialogue Emotion Recognizer | Code | 1
DialogXL: All-in-One XLNet for Multi-Party Conversation Emotion Recognition | Code | 1
A Hierarchical Transformer with Speaker Modeling for Emotion Recognition in Conversation | Code | 1
Disentangled Variational Autoencoder for Emotion Recognition in Conversations | Code | 1
DialogueCRN: Contextual Reasoning Networks for Emotion Recognition in Conversations | Code | 1
ADVISER: A Toolkit for Developing Multi-modal, Multi-domain and Socially-engaged Conversational Agents | Code | 1
Page 4 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | – | Unverified
2 | M2D2 | EmoA | 76.7 | – | Unverified
3 | M2D | EmoA | 76.1 | – | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | – | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | – | Unverified
2 | MultiMAE-DER | WAR | 83.61 | – | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | – | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | – | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | – | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'" | 1 | – | Unverified
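The dimensional speech benchmark above reports a concordance correlation coefficient (CCC), the standard metric for continuous valence/arousal prediction. A minimal sketch of how CCC is typically computed (the `concordance_ccc` helper is illustrative, not taken from any listed paper; it implements Lin's population-variance formula):

```python
def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient:
    CCC = 2*cov(t, p) / (var(t) + var(p) + (mean(t) - mean(p))**2).
    Unlike Pearson correlation, CCC also penalizes shifts and scale
    differences between predictions and targets."""
    n = len(y_true)
    mean_t = sum(y_true) / n
    mean_p = sum(y_pred) / n
    var_t = sum((t - mean_t) ** 2 for t in y_true) / n
    var_p = sum((p - mean_p) ** 2 for p in y_pred) / n
    cov = sum((t - mean_t) * (p - mean_p)
              for t, p in zip(y_true, y_pred)) / n
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

Perfect agreement yields 1.0; a constant offset lowers the score even when the Pearson correlation stays at 1 (e.g. `concordance_ccc([1, 2, 3], [2, 3, 4])` gives 4/7 ≈ 0.571), which is why CCC is preferred over plain correlation on these leaderboards.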