SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 251–300 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| GA2MIF: Graph and Attention Based Two-Stage Multi-Source Information Fusion for Conversational Emotion Detection | Code | 1 |
| Attribute Inference Attack of Speech Emotion Recognition in Federated Learning Settings | Code | 1 |
| CLARA: Multilingual Contrastive Learning for Audio Representation Acquisition | Code | 1 |
| FV2ES: A Fully End2End Multimodal System for Fast Yet Effective Video Emotion Recognition Inference | Code | 1 |
| How you feelin'? Learning Emotions and Mental States in Movie Scenes | Code | 1 |
| HTNet for micro-expression recognition | Code | 1 |
| HiGRU: Hierarchical Gated Recurrent Units for Utterance-level Emotion Recognition | Code | 1 |
| Hierarchical Dialogue Understanding with Special Tokens and Turn-level Attention | Code | 1 |
| HiQuE: Hierarchical Question Embedding Network for Multimodal Depression Detection | Code | 1 |
| Hypercomplex Multimodal Emotion Recognition from EEG and Peripheral Physiological Signals | Code | 1 |
| Graph Based Network with Contextualized Representations of Turns in Dialogue | Code | 1 |
| GPT as Psychologist? Preliminary Evaluations for GPT-4V on Visual Affective Computing | Code | 1 |
| GraphCFC: A Directed Graph Based Cross-Modal Feature Complementation Approach for Multimodal Conversational Emotion Recognition | Code | 1 |
| GM-TCNet: Gated Multi-scale Temporal Convolutional Network using Emotion Causality for Speech Emotion Recognition | Code | 1 |
| CARAT: Contrastive Feature Reconstruction and Aggregation for Multi-Modal Multi-Label Emotion Recognition | Code | 1 |
| CAGE: Circumplex Affect Guided Expression Inference | Code | 1 |
| How Deep Neural Networks Can Improve Emotion Recognition on Video Data | Code | 1 |
| CFN-ESA: A Cross-Modal Fusion Network with Emotion-Shift Awareness for Dialogue Emotion Recognition | Code | 1 |
| Continuous Emotion Recognition with Audio-visual Leader-follower Attentive Fusion | Code | 1 |
| ChatGPT: Jack of all trades, master of none | Code | 1 |
| CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition | Code | 1 |
| Audio-Visual Fusion for Emotion Recognition in the Valence-Arousal Space Using Joint Cross-Attention | Code | 1 |
| COGMEN: COntextualized GNN based Multimodal Emotion recognitioN | Code | 1 |
| Codified audio language modeling learns useful representations for music information retrieval | Code | 1 |
| CoMPM: Context Modeling with Speaker's Pre-trained Memory Tracking for Emotion Recognition in Conversation | Code | 1 |
| Compact Graph Architecture for Speech Emotion Recognition | Code | 1 |
| Automated Parkinson's Disease Detection and Affective Analysis from Emotional EEG Signals | Code | 1 |
| GPT-4V with Emotion: A Zero-shot Benchmark for Generalized Emotion Recognition | Code | 1 |
| Is Cross-Attention Preferable to Self-Attention for Multi-Modal Emotion Recognition? | Code | 1 |
| Context De-confounded Emotion Recognition | Code | 1 |
| Context Based Emotion Recognition using EMOTIC Dataset | Code | 1 |
| Group Gated Fusion on Attention-based Bidirectional Alignment for Multimodal Emotion Recognition | Code | 1 |
| K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations | Code | 1 |
| Contextual Information and Commonsense Based Prompt for Emotion Recognition in Conversation | Code | 1 |
| A vector quantized masked autoencoder for speech emotion recognition | Code | 1 |
| Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | Code | 1 |
| BHAAV- A Text Corpus for Emotion Analysis from Hindi Stories | Code | 1 |
| Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1 |
| Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1 |
| BiosERC: Integrating Biography Speakers Supported by LLMs for ERC Tasks | Code | 1 |
| Leveraging Label Correlations in a Multi-label Setting: A Case Study in Emotion | Code | 1 |
| DialogueGCN: A Graph Convolutional Neural Network for Emotion Recognition in Conversation | Code | 1 |
| Cross-Lingual Cross-Age Group Adaptation for Low-Resource Elderly Speech Emotion Recognition | Code | 1 |
| HetEmotionNet: Two-Stream Heterogeneous Graph Recurrent Neural Network for Multi-modal Emotion Recognition | Code | 1 |
| Cross Task Neural Architecture Search for EEG Signal Classifications | Code | 1 |
| IdentiFace: A VGG Based Multimodal Facial Biometric System | Code | 1 |
| EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding | Code | 1 |
| Learning and Evaluating Emotion Lexicons for 91 Languages | Code | 1 |
| MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1 |
Page 6 of 41

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | — | Unverified |
| 2 | M2D2 | EmoA | 76.7 | — | Unverified |
| 3 | M2D | EmoA | 76.1 | — | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | — | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | — | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | — | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | — | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | — | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | — | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | — | Unverified |
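The F1-score (Weighted) metric reported above is the support-weighted mean of per-class F1 scores, so frequent classes contribute more to the final number. A minimal sketch of the computation (the helper name `weighted_f1` is illustrative, not from the source):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Support-weighted average of per-class F1 scores."""
    support = Counter(y_true)  # number of true samples per class
    total = 0.0
    for c in support:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += support[c] * f1  # weight each class F1 by its support
    return total / len(y_true)
```

In practice this is equivalent to `sklearn.metrics.f1_score(..., average="weighted")`.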
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | — | Unverified |
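The concordance correlation coefficient (CCC) used above is the standard agreement metric for continuous (valence/arousal) emotion prediction: Lin's rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal NumPy sketch (the function name `ccc` is illustrative, not from the source):

```python
import numpy as np

def ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient between two sequences."""
    x = np.asarray(y_true, dtype=float)
    y = np.asarray(y_pred, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # population covariance
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```

Unlike Pearson correlation, CCC also penalizes scale and mean shifts, so it only reaches 1.0 when predictions match the targets exactly.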
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | — | 1'"1 | — | Unverified |