SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 501-550 of 2041 papers

Title | Status | Hype
Joint Multimodal Transformer for Emotion Recognition in the Wild | Code | 1
DiTMoS: Delving into Diverse Tiny-Model Selection on Microcontrollers | Code | 0
A Multimodal Fusion Network For Student Emotion Recognition Based on Transformer and Tensor Product | - | 0
CKERC: Joint Large Language Models with Commonsense Knowledge for Emotion Recognition in Conversation | - | 0
Group Movie Selection using Multi-channel Emotion Recognition | - | 0
Human Pose Descriptions and Subject-Focused Attention for Improved Zero-Shot Transfer in Human-Centric Classification Tasks | - | 0
Computational Analysis of Stress, Depression and Engagement in Mental Health: A Survey | - | 0
Robust Emotion Recognition in Context Debiasing | - | 0
GPT as Psychologist? Preliminary Evaluations for GPT-4V on Visual Affective Computing | Code | 1
Speech Emotion Recognition Via CNN-Transformer and Multidimensional Attention Mechanism | Code | 1
Cultural-Aware AI Model for Emotion Recognition | Code | 0
Cascaded Self-supervised Learning for Subject-independent EEG-based Emotion Recognition | - | 0
Angry Men, Sad Women: Large Language Models Reflect Gendered Stereotypes in Emotion Attribution | Code | 0
EMOVOME: A Dataset for Emotion Recognition in Spontaneous Real-Life Speech | Code | 0
Emotion Analysis in NLP: Trends, Gaps and Roadmap for Future Directions | Code | 0
Exploring the dynamic interplay of cognitive load and emotional arousal by using multimodal measurements: Correlation of pupil diameter and emotional arousal in emotionally engaging tasks | - | 0
SemEval 2024 -- Task 10: Emotion Discovery and Reasoning its Flip in Conversation (EDiReF) | Code | 0
Curriculum Learning Meets Directed Acyclic Graph for Multimodal Emotion Recognition | Code | 1
Emotional Voice Messages (EMOVOME) database: emotion recognition in spontaneous voice messages | - | 0
EGNN-C+: Interpretable Evolving Granular Neural Network and Application in Classification of Weakly-Supervised EEG Data Streams | - | 0
ASEM: Enhancing Empathy in Chatbot through Attention-based Sentiment and Emotion Modeling | Code | 0
GiMeFive: Towards Interpretable Facial Emotion Classification | Code | 1
The AffectToolbox: Affect Analysis for Everyone | - | 0
Filter-based multi-task cross-corpus feature learning for speech emotion recognition | Code | 0
EMO-SUPERB: An In-depth Look at Speech Emotion Recognition | Code | 2
Parameter Efficient Finetuning for Speech Emotion Recognition and Domain Adaptation | - | 0
EmoBench: Evaluating the Emotional Intelligence of Large Language Models | Code | 2
Ain't Misbehavin' -- Using LLMs to Generate Expressive Robot Behavior in Conversations with the Tabletop Robot Haru | - | 0
Personalized Large Language Models | Code | 2
Multi-Modal Emotion Recognition by Text, Speech and Video Using Pretrained Transformers | - | 0
Persian Speech Emotion Recognition by Fine-Tuning Transformers | - | 0
CochCeps-Augment: A Novel Self-Supervised Contrastive Learning Using Cochlear Cepstrum-based Masking for Speech Emotion Recognition | Code | 0
Evaluation Metrics for Automated Typographic Poster Generation | Code | 0
English Prompts are Better for NLI-based Zero-Shot Emotion Classification than Target-Language Prompts | - | 0
Layer-Wise Analysis of Self-Supervised Acoustic Word Embeddings: A Study on Speech Emotion Recognition | - | 0
Graph Neural Networks in EEG-based Emotion Recognition: A Survey | - | 0
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | - | 0
Are Paralinguistic Representations all that is needed for Speech Emotion Recognition? | - | 0
FindingEmo: An Image Dataset for Emotion Recognition in the Wild | - | 0
LRDif: Diffusion Models for Under-Display Camera Emotion Recognition | - | 0
Neuromorphic Valence and Arousal Estimation | - | 0
Real-time EEG-based Emotion Recognition Model using Principal Component Analysis and Tree-based Models for Neurohumanities | Code | 0
AMuSE: Adaptive Multimodal Analysis for Speaker Emotion Recognition in Group Conversations | - | 0
MF-AED-AEC: Speech Emotion Recognition by Leveraging Multimodal Fusion, Asr Error Detection, and Asr Error Correction | - | 0
Density Adaptive Attention is All You Need: Robust Parameter-Efficient Fine-Tuning Across Multiple Modalities | Code | 1
Revealing Emotional Clusters in Speaker Embeddings: A Contrastive Learning Strategy for Speech Emotion Recognition | - | 0
Speech Swin-Transformer: Exploring a Hierarchical Transformer with Shifted Windows for Speech Emotion Recognition | - | 0
Self context-aware emotion perception on human-robot interaction | - | 0
Improving Speaker-independent Speech Emotion Recognition Using Dynamic Joint Distribution Adaptation | - | 0
TelME: Teacher-leading Multimodal Fusion Network for Emotion Recognition in Conversation | Code | 1
Page 11 of 41

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | - | Unverified
2 | M2D2 | EmoA | 76.7 | - | Unverified
3 | M2D | EmoA | 76.1 | - | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | - | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | - | Unverified
2 | MultiMAE-DER | WAR | 83.61 | - | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | - | Unverified
4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | - | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | - | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | - | Unverified
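The Top-3 Accuracy metric in the table above counts a prediction as correct when the true label appears among the model's three highest-scoring classes. A minimal sketch of how such a figure is computed (the function name and array layout are illustrative, not taken from this leaderboard; it assumes at least three classes):

```python
import numpy as np

def top3_accuracy(scores, labels):
    """Fraction of samples whose true label is in the top 3 predicted classes.

    scores: (n_samples, n_classes) array of class scores.
    labels: sequence of n_samples true class indices.
    """
    # Indices of the three highest-scoring classes per sample.
    top3 = np.argsort(scores, axis=1)[:, -3:]
    hits = [label in row for label, row in zip(labels, top3)]
    return float(np.mean(hits))

scores = np.array([[0.1, 0.2, 0.3, 0.4],
                   [0.4, 0.3, 0.2, 0.1]])
print(top3_accuracy(scores, [0, 0]))  # 0.5: class 0 is top-3 only in row 2
```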
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | - | Unverified
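The F1-score (Weighted) metric above averages per-class F1 scores, weighting each class by its support (its share of true samples), which keeps majority classes from being drowned out in imbalanced emotion datasets. A minimal pure-Python sketch of the metric's definition (not the leaderboard's actual scoring code):

```python
def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with weights proportional to class support."""
    n = len(y_true)
    total = 0.0
    for c in set(y_true):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        support = sum(1 for t in y_true if t == c)
        total += f1 * support / n  # weight F1 by the class's share of samples
    return total

print(weighted_f1(["joy", "joy", "sad"], ["joy", "sad", "sad"]))
```

This matches scikit-learn's `f1_score(..., average="weighted")` on the same labels.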
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | - | Unverified
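The concordance correlation coefficient (CCC) used above is the standard metric for continuous valence/arousal prediction: it rewards both correlation with and closeness to the ground-truth ratings, via 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal sketch of its definition (function name is illustrative):

```python
import numpy as np

def ccc(y_true, y_pred):
    """Concordance correlation coefficient between two rating sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    # Penalizes both low correlation and a shift between the two means.
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

print(ccc([0.1, 0.5, 0.9], [0.1, 0.5, 0.9]))  # 1.0 for perfect agreement
```

Unlike plain Pearson correlation, CCC drops below 1 whenever predictions are biased or mis-scaled, even if they are perfectly correlated.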
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 1'"1 | - | - | Unverified