SOTAVerified

Emotion Recognition

Emotion Recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 151–175 of 2041 papers

| Title | Status | Hype |
|---|---|---|
| Partial Label Learning for Emotion Recognition from EEG | Code | 1 |
| ChatGPT: Jack of all trades, master of none | Code | 1 |
| Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | Code | 1 |
| EmoGator: A New Open Source Vocal Burst Dataset with Baseline Machine Learning Classification Methodologies | Code | 1 |
| Multivariate, Multi-Frequency and Multimodal: Rethinking Graph Neural Networks for Emotion Recognition in Conversation | Code | 1 |
| Large Raw Emotional Dataset with Aggregation Mechanism | Code | 1 |
| A Persian ASR-based SER: Modification of Sharif Emotional Speech Database and Investigation of Persian Text Corpora | Code | 1 |
| YM2413-MDB: A Multi-Instrumental FM Video Game Music Dataset with Emotion Annotations | Code | 1 |
| Distribution-based Emotion Recognition in Conversation | Code | 1 |
| SPEAKER VGG CCT: Cross-corpus Speech Emotion Recognition with Speaker Embedding and Vision Transformers | Code | 1 |
| Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1 |
| Using Emotion Embeddings to Transfer Knowledge Between Emotions, Languages, and Annotation Formats | Code | 1 |
| GM-TCNet: Gated Multi-scale Temporal Convolutional Network using Emotion Causality for Speech Emotion Recognition | Code | 1 |
| Leveraging Label Correlations in a Multi-label Setting: A Case Study in Emotion | Code | 1 |
| Exploiting modality-invariant feature for robust multimodal emotion recognition with missing modalities | Code | 1 |
| Empathetic Dialogue Generation via Sensitive Emotion Recognition and Sensible Knowledge Selection | Code | 1 |
| Supervised Prototypical Contrastive Learning for Emotion Recognition in Conversation | Code | 1 |
| In Search of a Robust Facial Expressions Recognition Model: A Large-Scale Visual Cross-Corpus Study | Code | 1 |
| Cross Task Neural Architecture Search for EEG Signal Classifications | Code | 1 |
| MuCDN: Mutual Conversational Detachment Network for Emotion Recognition in Multi-Party Conversations | Code | 1 |
| FV2ES: A Fully End2End Multimodal System for Fast Yet Effective Video Emotion Recognition Inference | Code | 1 |
| Audio-Visual Fusion for Emotion Recognition in the Valence-Arousal Space Using Joint Cross-Attention | Code | 1 |
| Non-Contrastive Self-Supervised Learning of Utterance-Level Speech Representations | Code | 1 |
| Contextual Information and Commonsense Based Prompt for Emotion Recognition in Conversation | Code | 1 |
| GA2MIF: Graph and Attention Based Two-Stage Multi-Source Information Fusion for Conversational Emotion Detection | Code | 1 |
Page 7 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
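Top-3 accuracy, as reported in the leaderboard above, counts a prediction as correct when the ground-truth class appears among the model's three highest-scoring classes. A minimal NumPy sketch (the function and variable names are illustrative, not taken from any of the listed papers):

```python
import numpy as np

def top_k_accuracy(scores, labels, k=3):
    """Fraction of samples whose true label is among the k top-scoring classes.

    scores: (n_samples, n_classes) array of per-class scores.
    labels: (n_samples,) array of integer ground-truth labels.
    """
    topk = np.argsort(scores, axis=1)[:, -k:]  # indices of the k best classes
    hits = [label in row for label, row in zip(labels, topk)]
    return float(np.mean(hits))

scores = np.array([[0.1, 0.2, 0.3, 0.4],   # true class 3 is ranked 1st
                   [0.4, 0.1, 0.3, 0.2]])  # true class 1 is ranked last
print(top_k_accuracy(scores, np.array([3, 1]), k=3))  # -> 0.5
```

With many emotion classes (as in the zero-shot settings these papers target), top-3 scores in the low teens still indicate performance above chance.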

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
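The weighted F1-score used in the row above averages per-class F1 with each class weighted by its support (its frequency in the ground truth), which keeps rare emotion classes from being drowned out entirely while still reflecting class imbalance. A self-contained sketch, equivalent in intent to scikit-learn's `f1_score(..., average='weighted')`:

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Support-weighted average of per-class F1 scores."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)  # class frequencies in the ground truth
    total = 0.0
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += support[c] * f1
    return total / len(y_true)

print(weighted_f1([0, 0, 1, 1], [0, 0, 0, 0]))  # -> 0.333...
```

Classes a model never predicts contribute an F1 of 0 at full weight, so degenerate always-predict-majority models are penalized.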

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
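The concordance correlation coefficient (CCC) in the row above is the standard metric for continuous valence/arousal prediction: unlike plain Pearson correlation, it also penalizes mean and variance mismatch, CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal NumPy sketch:

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two 1-D series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # biased covariance
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

print(ccc([1, 2, 3, 4], [1, 2, 3, 4]))  # perfect agreement -> 1.0
print(ccc([1, 2, 3, 4], [2, 3, 4, 5]))  # perfectly correlated but shifted -> ~0.714
```

A constant mean offset lowers CCC even when Pearson correlation is exactly 1, which is why it is preferred for dimensional emotion benchmarks.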

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | 1'" | 1 | | Unverified |