SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 726-750 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| A Facial Expression-Aware Multimodal Multi-task Learning Framework for Emotion Recognition in Multi-party Conversations | Code | 1 |
| Empirical Interpretation of the Relationship Between Speech Acoustic Context and Emotion Recognition | | 0 |
| EmoSpeech: Guiding FastSpeech2 Towards Emotional Text to Speech | Code | 1 |
| Emotion Analysis of Tweets Banning Education in Afghanistan | | 0 |
| Exploiting Pseudo Future Contexts for Emotion Recognition in Conversations | Code | 0 |
| Cross-Lingual Cross-Age Group Adaptation for Low-Resource Elderly Speech Emotion Recognition | Code | 1 |
| Cross-Language Speech Emotion Recognition Using Multimodal Dual Attention Transformers | | 0 |
| TACOformer: Token-channel compounded Cross Attention for Multimodal Emotion Recognition | | 0 |
| Speech Emotion Diarization: Which Emotion Appears When? | Code | 1 |
| A Comparison of Time-based Models for Multimodal Emotion Recognition | | 0 |
| A Low-rank Matching Attention based Cross-modal Feature Fusion Method for Conversational Emotion Recognition | | 0 |
| FedMultimodal: A Benchmark For Multimodal Federated Learning | Code | 0 |
| SAFER: Situation Aware Facial Emotion Recognition | | 0 |
| Continuous Learning Based Novelty Aware Emotion Recognition System | | 0 |
| EMERSK -- Explainable Multimodal Emotion Recognition with Situational Knowledge | | 0 |
| GEmo-CLAP: Gender-Attribute-Enhanced Contrastive Language-Audio Pretraining for Accurate Speech Emotion Recognition | | 0 |
| MFSN: Multi-perspective Fusion Search Network For Pre-training Knowledge in Speech Emotion Recognition | | 0 |
| Exploring Attention Mechanisms for Multimodal Emotion Recognition in an Emergency Call Center Corpus | | 0 |
| Mimicking the Thinking Process for Emotion Recognition in Conversation with Prompts and Paraphrasing | Code | 0 |
| Estimating the Uncertainty in Emotion Attributes using Deep Evidential Regression | Code | 1 |
| TS-MoCo: Time-Series Momentum Contrast for Self-Supervised Physiological Representation Learning | Code | 1 |
| Modality Influence in Multimodal Machine Learning | | 0 |
| Learning Emotional Representations from Imbalanced Speech Data for Speech Emotion Recognition and Emotional Text-to-Speech | | 0 |
| A Quantum Probability Driven Framework for Joint Multi-Modal Sarcasm, Sentiment and Emotion Analysis | | 0 |
| Synthesizing Affective Neurophysiological Signals Using Generative Models: A Review Paper | | 0 |
Page 30 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | | 1'"1 | | Unverified |
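One of the benchmarks above reports the concordance correlation coefficient (CCC), a common metric for continuous emotion attributes such as arousal and valence. For readers comparing these numbers, here is a minimal sketch of how CCC is typically computed (Lin's formulation; population statistics assumed, which matches common affective-computing practice but may differ from a specific paper's implementation):

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Concordance correlation coefficient (Lin, 1989).

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    Ranges in [-1, 1]; 1 means perfect agreement in both
    correlation and scale/location.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    # Population (biased) variance and covariance, per Lin's definition.
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

Unlike Pearson correlation, CCC also penalizes systematic offsets: predictions that track the target perfectly but are shifted by a constant score below 1.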