SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 801–825 of 2041 papers

Title | Status | Hype
Construction of English-French Multimodal Affective Conversational Corpus from TV Dramas |  | 0
EmoWOZ: A Large-Scale Corpus and Labelling Scheme for Emotion Recognition in Task-Oriented Dialogue Systems |  | 0
Empathetic Conversational Agents: Utilizing Neural and Physiological Signals for Enhanced Empathetic Interactions |  | 0
A Comparison Of Emotion Annotation Schemes And A New Annotated Data Set |  | 0
Empathy and Distress Prediction using Transformer Multi-output Regression and Emotion Analysis with an Ensemble of Supervised and Zero-Shot Learning Models |  | 0
Empathy Through Multimodality in Conversational Interfaces |  | 0
Empirical Analysis of Asynchronous Federated Learning on Heterogeneous Devices: Efficiency, Fairness, and Privacy Trade-offs |  | 0
Empirical Interpretation of Speech Emotion Perception with Attention Based Model for Speech Emotion Recognition |  | 0
Empirical Interpretation of the Relationship Between Speech Acoustic Context and Emotion Recognition |  | 0
Empowering Dysarthric Speech: Leveraging Advanced LLMs for Accurate Speech Correction and Multimodal Emotion Analysis |  | 0
ECG-EmotionNet: Nested Mixture of Expert (NMoE) Adaptation of ECG-Foundation Model for Driver Emotion Recognition |  | 0
Modeling Challenging Patient Interactions: LLMs for Medical Communication Training |  | 0
Enabling Deep Learning of Emotion With First-Person Seed Expressions |  | 0
End-to-End Continuous Speech Emotion Recognition in Real-life Customer Service Call Center Conversations |  | 0
End-to-End Emotional Speech Synthesis Using Style Tokens and Semi-Supervised Training |  | 0
End-to-end facial and physiological model for Affective Computing and applications |  | 0
Early Joint Learning of Emotion Information Makes MultiModal Model Understand You Better |  | 0
A Real Time Facial Expression Classification System Using Local Binary Patterns |  | 0
EALD-MLLM: Emotion Analysis in Long-sequential and De-identity videos with Multi-modal Large Language Model |  | 0
Beyond Isolated Utterances: Conversational Emotion Recognition |  | 0
End-to-end transfer learning for speaker-independent cross-language and cross-corpus speech emotion recognition |  | 0
An Audio-Video Deep and Transfer Learning Framework for Multimodal Emotion Recognition in the wild |  | 0
English Prompts are Better for NLI-based Zero-Shot Emotion Classification than Target-Language Prompts |  | 0
Are Large Language Models More Empathetic than Humans? |  | 0
標記對於類神經語音情緒辨識系統辨識效果之影響 (Effects of Label in Neural Speech Emotion Recognition System) [In Chinese] |  | 0
Page 33 of 82

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 |  | Unverified
2 | M2D2 | EmoA | 76.7 |  | Unverified
3 | M2D | EmoA | 76.1 |  | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 |  | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 |  | Unverified
2 | MultiMAE-DER | WAR | 83.61 |  | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 |  | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 |  | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 |  | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 |  | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN |  | 1'"1 |  | Unverified
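Most leaderboards above report plain accuracy, but the continuous-emotion entry uses the concordance correlation coefficient (CCC), which penalizes both low correlation and mean/scale offsets between predictions and labels. As a reference point, here is a minimal NumPy sketch of Lin's CCC (illustrative only; this is not the verification code used by this site):

```python
import numpy as np

def ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient between two 1-D sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mu_t, mu_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()  # population variance (ddof=0)
    cov = ((y_true - mu_t) * (y_pred - mu_p)).mean()
    # CCC = 2*cov / (var_true + var_pred + (mean_true - mean_pred)^2)
    return 2 * cov / (var_t + var_p + (mu_t - mu_p) ** 2)

# Identical sequences agree perfectly:
print(ccc([0.1, 0.5, 0.9], [0.1, 0.5, 0.9]))  # → 1.0
```

A CCC of 0.64, as claimed for w2v2-L-robust-12, thus indicates moderate agreement between predicted and annotated continuous emotion values, not a percentage accuracy.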