SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 1651–1675 of 2041 papers

Title | Status | Hype
Benchmarking Domain Generalization on EEG-based Emotion Recognition | | 0
Benchmarking Multimodal Sentiment Analysis | | 0
BERT-ERC: Fine-tuning BERT is Enough for Emotion Recognition in Conversation | | 0
Best Practices for Noise-Based Augmentation to Improve the Performance of Deployable Speech-Based Emotion Recognition Systems | | 0
Better Spanish Emotion Recognition In-the-wild: Bringing Attention to Deep Spectrum Voice Analysis | | 0
Beware of Overestimated Decoding Performance Arising from Temporal Autocorrelations in Electroencephalogram Signals | | 0
Beyond Classification: Towards Speech Emotion Reasoning with Multitask AudioLLMs | | 0
Beyond Isolated Utterances: Conversational Emotion Recognition | | 0
Modeling Challenging Patient Interactions: LLMs for Medical Communication Training | | 0
Bias and Fairness on Multimodal Emotion Detection Algorithms | | 0
Eradicating Social Biases in Sentiment Analysis using Semantic Blinding and Semantic Propagation Graph Neural Networks | | 0
Bias in Emotion Recognition with ChatGPT | | 0
Bimodal Connection Attention Fusion for Speech Emotion Recognition | | 0
Bimodal Speech Emotion Recognition Using Pre-Trained Language Models | | 0
Biologically inspired speech emotion recognition | | 0
Boosting Continuous Emotion Recognition with Self-Pretraining using Masked Autoencoders, Temporal Convolutional Networks, and Transformers | | 0
Brain Computer Interface: Deep Learning Approach to Predict Human Emotion Recognition | | 0
Where are We in Event-centric Emotion Analysis? Bridging Emotion Role Labeling and Appraisal-based Approaches | | 0
Building a Multimodal Laughter Database for Emotion Recognition | | 0
CAKE: Compact and Accurate K-dimensional representation of Emotion | | 0
CALLM: Understanding Cancer Survivors' Emotions and Intervention Opportunities via Mobile Diaries and Context-Aware Language Models | | 0
CALM: Contrastive Aligned Audio-Language Multirate and Multimodal Representations | | 0
Camera-based implicit mind reading by capturing higher-order semantic dynamics of human gaze within environmental context | | 0
CA-MHFA: A Context-Aware Multi-Head Factorized Attentive Pooling for SSL-Based Speaker Verification | | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | | Unverified
2 | M2D2 | EmoA | 76.7 | | Unverified
3 | M2D | EmoA | 76.1 | | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified
2 | MultiMAE-DER | WAR | 83.61 | | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'" | 1 | | Unverified
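Most results above report accuracy-style percentages, but the continuous-emotion entry (w2v2-L-robust-12) uses the concordance correlation coefficient (CCC), which rewards both correlation with and calibration to the target values. As a point of reference, a minimal sketch of how CCC is commonly computed (the function name and sample data here are illustrative, not from any of the listed papers):

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Concordance correlation coefficient:
    2*cov(t, p) / (var(t) + var(p) + (mean(t) - mean(p))**2).
    Equals 1 only for perfect agreement (not just perfect correlation)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()  # population variances
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    return 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

# A constant offset keeps Pearson correlation at 1.0 but lowers CCC:
print(concordance_ccc([1, 2, 3], [2, 3, 4]))  # 4/7 ≈ 0.571
```

Unlike Pearson correlation, CCC penalizes systematic bias and scale mismatch, which is why it is the standard metric for continuous arousal/valence prediction.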