SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 1476–1500 of 2041 papers

Title | Status | Hype
Unsupervised Multi-Modal Representation Learning for Affective Computing with Multi-Corpus Wearable Data | — | 0
A Efficient Multimodal Framework for Large Scale Emotion Recognition by Fusing Music and Electrodermal Activity Signals | Code | 1
EmoGraph: Capturing Emotion Correlations using Graph Networks | — | 0
Spatio-Temporal EEG Representation Learning on Riemannian Manifold and Euclidean Space | Code | 1
Emotion Carrier Recognition from Personal Narratives | — | 0
Hey Human, If your Facial Emotions are Uncertain, You Should Use Bayesian Neural Networks! | — | 0
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
SemEval-2020 Task 8: Memotion Analysis -- The Visuo-Lingual Metaphor! | Code | 1
Speech Driven Talking Face Generation from a Single Image and an Emotion Condition | Code | 1
A Transfer Learning Method for Speech Emotion Recognition from Automatic Speech Recognition | — | 0
Dynamic Emotion Modeling with Learnable Graphs and Graph Inception Network | Code | 1
Compact Graph Architecture for Speech Emotion Recognition | Code | 1
The BIRAFFE2 Experiment. Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems | — | 0
DSC IIT-ISM at SemEval-2020 Task 8: Bi-Fusion Techniques for Deep Meme Emotion Analysis | Code | 0
YNU-HPCC at SemEval-2020 Task 8: Using a Parallel-Channel Model for Memotion Analysis | Code | 0
Emotion Correlation Mining Through Deep Learning Models on Natural Language Text | — | 0
Variants of BERT, Random Forests and SVM approach for Multimodal Emotion-Target Sub-challenge | — | 0
HEU Emotion: A Large-scale Database for Multi-modal Emotion Recognition in the Wild | — | 0
Generative Adversarial Stacked Autoencoders for Facial Pose Normalization and Emotion Recognition | — | 0
COVID-19 Twitter Dataset with Latent Topics, Sentiments and Emotions Attributes | Code | 0
TCGM: An Information-Theoretic Framework for Semi-Supervised Multi-Modality Learning | — | 0
Temporal aggregation of audio-visual modalities for emotion recognition | — | 0
Low Rank Fusion based Transformers for Multimodal Sequences | — | 0
Shallow over Deep Neural Networks: A empirical analysis for human emotion classification using audio data | — | 0
TRANSFER :- DEEP INDUCTIVE NETWORK FOR FACIAL EMOTION RECOGNITION | Code | 0
Page 60 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | — | Unverified
2 | M2D2 | EmoA | 76.7 | — | Unverified
3 | M2D | EmoA | 76.1 | — | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | — | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | — | Unverified
2 | MultiMAE-DER | WAR | 83.61 | — | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | — | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | — | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | — | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | — | Unverified
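Top-3 Accuracy, as reported above, counts a prediction as correct when the true label appears among the model's three highest-scoring classes. A minimal NumPy sketch (the function name and toy scores are illustrative, not taken from any listed paper):

```python
import numpy as np

def top_k_accuracy(scores, labels, k=3):
    """Fraction of samples whose true label is among the k highest scores."""
    topk = np.argsort(scores, axis=1)[:, -k:]            # indices of the k best classes
    hits = (topk == np.asarray(labels)[:, None]).any(axis=1)
    return float(hits.mean())

# toy example: 2 samples, 4 classes
scores = np.array([[0.1, 0.5, 0.2, 0.9],
                   [0.8, 0.1, 0.05, 0.05]])
print(top_k_accuracy(scores, [1, 3], k=3))  # both true labels fall in the top 3 → 1.0
```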
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | — | Unverified
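The weighted F1-score used above averages the per-class F1 with weights proportional to each class's support. A plain-Python sketch of that computation (intended to match the standard definition, e.g. scikit-learn's `average='weighted'`; names are illustrative):

```python
def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with weights proportional to class support."""
    total, score = len(y_true), 0.0
    for c in set(y_true):                 # classes present in the gold labels
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        denom = 2 * tp + fp + fn
        f1 = 2 * tp / denom if denom else 0.0
        support = sum(t == c for t in y_true)
        score += (support / total) * f1
    return score

print(weighted_f1([0, 1, 2], [0, 1, 2]))  # perfect predictions → 1.0
```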
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | — | Unverified
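The concordance correlation coefficient (CCC) used above measures agreement between predicted and gold continuous emotion values, defined as CCC = 2·cov(x, y) / (var(x) + var(y) + (mean(x) − mean(y))²). A minimal NumPy sketch (function name is illustrative):

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two 1-D arrays."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()              # population variances
    cov = ((x - mx) * (y - my)).mean()     # population covariance
    return float(2 * cov / (vx + vy + (mx - my) ** 2))

print(ccc([1, 2, 3], [1, 2, 3]))   # perfect agreement → 1.0
print(ccc([1, 2, 3], [3, 2, 1]))   # perfect disagreement → -1.0
```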
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | — | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | — | 1'"1 | — | Unverified