SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
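As a minimal illustration of the speech modality mentioned above, the sketch below slices a waveform into frames and computes two classic low-level descriptors (log frame energy and zero-crossing rate) that can feed any downstream emotion classifier. All names and parameter values here are illustrative, not taken from any paper on this page.

```python
import numpy as np

def frame_features(signal, frame_len=400, hop=160):
    """Slice a mono signal into overlapping frames and compute two
    classic low-level descriptors per frame: log energy and
    zero-crossing rate. Frame/hop sizes are typical 25 ms / 10 ms
    windows at 16 kHz (an assumption, not a fixed standard)."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    feats = np.empty((n_frames, 2))
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_len]
        energy = np.log(np.sum(frame ** 2) + 1e-10)          # log frame energy
        zcr = np.mean(np.abs(np.diff(np.sign(frame))) > 0)   # zero-crossing rate
        feats[i] = (energy, zcr)
    return feats

# Toy input: 1 second of a 220 Hz sine standing in for an utterance
sr = 16000
t = np.arange(sr) / sr
sig = np.sin(2 * np.pi * 220 * t)
X = frame_features(sig)   # (n_frames, 2) feature matrix for a classifier
```

Real systems replace these hand-crafted descriptors with MFCCs or learned representations (e.g. wav2vec 2.0 features, as in the benchmark entries below), but the frame-then-featurize structure is the same.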

Papers

Showing 1576–1600 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| Deep Representation Learning in Speech Processing: Challenges, Recent Advances, and Future Trends | | 0 |
| Ensemble emotion recognizing with multiple modal physiological signals | | 0 |
| Learning Transferable Features for Speech Emotion Recognition | | 0 |
| Emotion Recognition from Speech | Code | 0 |
| Emotion Recognition Using Wearables: A Systematic Literature Review | Work in progress | 0 |
| Context-Dependent Models for Predicting and Characterizing Facial Expressiveness | | 0 |
| End-to-end facial and physiological model for Affective Computing and applications | | 0 |
| Women in ISIS Propaganda: A Natural Language Processing Analysis of Topics and Emotions in a Comparison with Mainstream Religious Group | | 0 |
| GoodNewsEveryone: A Corpus of News Headlines Annotated with Emotions, Semantic Roles, and Reader Perception | | 0 |
| Learning Word Ratings for Empathy and Distress from Document-Level User Responses | | 0 |
| EDA: Enriching Emotional Dialogue Acts using an Ensemble of Neural Annotators | Code | 0 |
| Converting Sentiment Annotated Data to Emotion Annotated Data | | 0 |
| Bimodal Speech Emotion Recognition Using Pre-Trained Language Models | | 0 |
| Attentive Modality Hopping Mechanism for Speech Emotion Recognition | Code | 0 |
| Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model | | 0 |
| Emotion helps Sentiment: A Multi-task Model for Sentiment and Emotion Analysis | | 0 |
| A Time Series Analysis of Emotional Loading in Central Bank Statements | | 0 |
| Modeling emotion in complex stories: the Stanford Emotional Narratives Dataset | Code | 0 |
| MIMAMO Net: Integrating Micro- and Macro-motion for Video Emotion Recognition | Code | 0 |
| Emotion Recognition for Vietnamese Social Media Text | | 0 |
| Real-Time Emotion Recognition via Attention Gated Hierarchical Memory Network | Code | 0 |
| Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping | | 0 |
| Joint Emotion Label Space Modelling for Affect Lexica | | 0 |
| Speech Emotion Recognition Using Speech Feature and Word Embedding | Code | 0 |
| Learning Relationships between Text, Audio, and Video via Deep Canonical Correlation for Multimodal Language Analysis | | 0 |
Page 64 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
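The WAR metric reported above is weighted average recall, which on a fixed test set equals plain accuracy; its companion UAR (unweighted average recall) averages per-class recalls equally and is the fairer figure on class-imbalanced emotion corpora. A minimal sketch of both, with made-up labels:

```python
import numpy as np

def war_uar(y_true, y_pred):
    """Weighted average recall (WAR) and unweighted average recall (UAR).

    WAR weights each class's recall by its frequency, which makes it
    identical to overall accuracy; UAR gives every class equal weight.
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes, counts = np.unique(y_true, return_counts=True)
    recalls = np.array([np.mean(y_pred[y_true == c] == c) for c in classes])
    war = np.sum(recalls * counts / counts.sum())
    uar = recalls.mean()
    return war, uar

# Toy 3-class example (illustrative labels, not from any dataset above)
y_true = [0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 1, 1, 1, 0]
war, uar = war_uar(y_true, y_pred)
```

Here WAR is 4/6 (the accuracy), while UAR drops to 5/9 because the single class-2 sample is misclassified and counts as a whole class with recall 0.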
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
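Top-3 Accuracy, used above, counts a sample as correct if the true label is among the three highest-scoring classes. A minimal sketch with made-up scores (tie-free so the ranking is unambiguous):

```python
import numpy as np

def top_k_accuracy(scores, labels, k=3):
    """Fraction of samples whose true label is among the k
    highest-scoring classes."""
    topk = np.argsort(scores, axis=1)[:, -k:]   # indices of the k best classes
    hits = [label in row for row, label in zip(topk, labels)]
    return float(np.mean(hits))

# Toy scores over 5 classes for 3 samples (illustrative numbers only)
scores = np.array([
    [0.10, 0.50, 0.20, 0.15, 0.05],   # true label 1 -> in top 3
    [0.40, 0.05, 0.25, 0.20, 0.10],   # true label 3 -> in top 3
    [0.05, 0.10, 0.15, 0.20, 0.50],   # true label 0 -> miss
])
labels = [1, 3, 0]
acc = top_k_accuracy(scores, labels, k=3)   # 2 of 3 samples hit
```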
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
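The concordance correlation coefficient above measures agreement between continuous predictions and labels (common in dimensional emotion tasks such as arousal/valence regression), penalizing both scale and location shifts rather than only linear correlation. A minimal sketch:

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient:
    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
    Equals 1 only for perfect agreement; unlike Pearson r, a constant
    offset or rescaling of the predictions lowers the score."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    cov = np.mean((x - mx) * (y - my))   # population covariance
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

preds = np.array([1.0, 2.0, 3.0, 4.0])
perfect = ccc(preds, preds)          # 1.0
shifted = ccc(preds, preds + 1.0)    # < 1.0 despite Pearson r = 1
```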
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | | 1'"1 | | Unverified |