SOTAVerified

Emotion Recognition

Emotion recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 126–150 of 2041 papers

| Title | Status | Hype |
|---|---|---|
| A novel Fourier Adjacency Transformer for advanced EEG emotion recognition | Code | 1 |
| Steering Language Model to Stable Speech Emotion Recognition via Contextual Perception and Chain of Thought | Code | 1 |
| Teleology-Driven Affective Computing: A Causal Framework for Sustained Well-Being | | 0 |
| Latent Distribution Decoupling: A Probabilistic Framework for Uncertainty-Aware Multimodal Emotion Recognition | Code | 1 |
| MSE-Adapter: A Lightweight Plugin Endowing LLMs with the Capability to Perform Multimodal Sentiment Analysis and Emotion Recognition | Code | 1 |
| A Survey of Personalized Large Language Models: Progress and Future Directions | Code | 2 |
| BRIGHTER: BRIdging the Gap in Human-Annotated Textual Emotion Recognition Datasets for 28 Languages | Code | 2 |
| Akan Cinematic Emotions (ACE): A Multimodal Multi-party Dataset for Emotion Recognition in Movie Dialogues | | 0 |
| Interpretable Concept-based Deep Learning Framework for Multimodal Human Behavior Modeling | | 0 |
| A Novel Dialect-Aware Framework for the Classification of Arabic Dialects and Emotions | | 0 |
| A Novel Approach to for Multimodal Emotion Recognition : Multimodal semantic information fusion | | 0 |
| Enhancing Higher Education with Generative AI: A Multimodal Approach for Personalised Learning | | 0 |
| RAMer: Reconstruction-based Adversarial Model for Multi-party Multi-modal Multi-label Emotion Recognition | Code | 0 |
| EmoBench-M: Benchmarking Emotional Intelligence for Multimodal Large Language Models | | 0 |
| Towards Unified Music Emotion Recognition across Dimensional and Categorical Models | Code | 1 |
| Emotion Recognition and Generation: A Comprehensive Review of Face, Speech, and Text Modalities | | 0 |
| SigWavNet: Learning Multiresolution Signal Wavelet Network for Speech Emotion Recognition | Code | 1 |
| Milmer: a Framework for Multiple Instance Learning based Multimodal Emotion Recognition | Code | 1 |
| Mini-ResEmoteNet: Leveraging Knowledge Distillation for Human-Centered Design | | 0 |
| Divergent Emotional Patterns in Disinformation on Social Media? An Analysis of Tweets and TikToks about the DANA in Valencia | | 0 |
| Linguistic Analysis of Sinhala YouTube Comments on Sinhala Music Videos: A Dataset Study | | 0 |
| Multimodal Magic Elevating Depression Detection with a Fusion of Text and Audio Intelligence | | 0 |
| Fuzzy-aware Loss for Source-free Domain Adaptation in Visual Emotion Recognition | | 0 |
| Cross-modal Context Fusion and Adaptive Graph Convolutional Network for Multimodal Conversational Emotion Recognition | | 0 |
| HumanOmni: A Large Vision-Speech Language Model for Human-Centric Video Understanding | | 0 |
Page 6 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
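The Top-3 accuracy metric in the leaderboard above counts a sample as correct when the true label appears among the model's three highest-scoring classes. A minimal sketch in plain Python (illustrative only; the score format and class names are assumptions, not tied to any listed model's code):

```python
def top_k_accuracy(scores, labels, k=3):
    """Fraction of samples whose true label is among the k highest-scoring classes.

    scores: list of dicts mapping class name -> model score, one dict per sample
    labels: list of true class names, aligned with scores
    """
    hits = 0
    for sample_scores, true_label in zip(scores, labels):
        # Classes ranked by score, highest first; keep the top k
        top_k = sorted(sample_scores, key=sample_scores.get, reverse=True)[:k]
        if true_label in top_k:
            hits += 1
    return hits / len(labels)

# Hypothetical 4-class example: first label is in the top 3, second is not
scores = [
    {"joy": 0.4, "anger": 0.3, "sadness": 0.2, "fear": 0.1},
    {"joy": 0.4, "anger": 0.3, "sadness": 0.2, "fear": 0.1},
]
print(top_k_accuracy(scores, ["sadness", "fear"]))  # 0.5
```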
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
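The weighted F1-score above averages the per-class F1 scores, weighting each class by its support (its number of true samples), which matters for emotion datasets with imbalanced label distributions. A minimal sketch in plain Python (an illustrative implementation, equivalent in spirit to `sklearn.metrics.f1_score(..., average="weighted")`):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """F1 per class, averaged with weights proportional to class support."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for cls in support:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p == cls)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p != cls)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        score += (support[cls] / total) * f1
    return score

# Hypothetical 2-class example with imbalanced support
print(weighted_f1(["joy", "joy", "anger"], ["joy", "anger", "anger"]))
```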
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
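The concordance correlation coefficient (CCC) reported above is the standard metric for dimensional (continuous) emotion prediction such as valence and arousal; unlike Pearson correlation it also penalizes bias and scale mismatch between predictions and labels. Its definition, CCC = 2·cov(x, y) / (var(x) + var(y) + (mean(x) − mean(y))²), can be sketched in plain Python (illustrative only, using population variance):

```python
def ccc(preds, labels):
    """Concordance correlation coefficient between predictions and labels.

    Equals 1 only for perfect agreement; drops when predictions are shifted
    or scaled relative to the labels, even if they correlate perfectly.
    """
    n = len(preds)
    mean_p = sum(preds) / n
    mean_l = sum(labels) / n
    var_p = sum((p - mean_p) ** 2 for p in preds) / n
    var_l = sum((l - mean_l) ** 2 for l in labels) / n
    cov = sum((p - mean_p) * (l - mean_l) for p, l in zip(preds, labels)) / n
    return 2 * cov / (var_p + var_l + (mean_p - mean_l) ** 2)

print(ccc([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # perfect agreement -> 1.0
print(ccc([1.0, 2.0, 3.0], [2.0, 3.0, 4.0]))  # same shape, shifted -> penalized
```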
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |