SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 1–25 of 2041 papers

Title | Status | Hype
CosyVoice 3: Towards In-the-wild Speech Generation via Scaling-up and Post-training | Code | 11
FunAudioLLM: Voice Understanding and Generation Foundation Models for Natural Interaction Between Humans and LLMs | Code | 11
R1-Omni: Explainable Omni-Multimodal Emotion Recognition with Reinforcement Learning | Code | 5
SZTU-CMU at MER2024: Improving Emotion-LLaMA with Conv-Attention for Multimodal Emotion Recognition | Code | 4
Emotion-LLaMA: Multimodal Emotion Recognition and Reasoning with Instruction Tuning | Code | 4
Large Brain Model for Learning Generic Representations with Tremendous EEG Data in BCI | Code | 4
OSUM: Advancing Open Speech Understanding Models with Limited Resources in Academia | Code | 3
Addressing Emotion Bias in Music Emotion Recognition and Generation with Frechet Audio Distance | Code | 3
EmoBox: Multilingual Multi-corpus Speech Emotion Recognition Toolkit and Benchmark | Code | 3
MER 2024: Semi-Supervised Learning, Noise Robustness, and Open-Vocabulary Multimodal Emotion Recognition | Code | 3
emotion2vec: Self-Supervised Pre-Training for Speech Emotion Representation | Code | 3
SALMONN: Towards Generic Hearing Abilities for Large Language Models | Code | 3
EmoSphere-SER: Enhancing Speech Emotion Recognition Through Spherical Representation with Auxiliary Classification | Code | 2
Why We Feel: Breaking Boundaries in Emotional Reasoning with Multimodal Large Language Models | Code | 2
BRIGHTER: BRIdging the Gap in Human-Annotated Textual Emotion Recognition Datasets for 28 Languages | Code | 2
A Survey of Personalized Large Language Models: Progress and Future Directions | Code | 2
Dynamic-SUPERB Phase-2: A Collaboratively Expanding Benchmark for Measuring the Capabilities of Spoken Language Models with 180 Tasks | Code | 2
LibEER: A Comprehensive Benchmark and Algorithm Library for EEG-based Emotion Recognition | Code | 2
Hierarchical Hypercomplex Network for Multimodal Emotion Recognition | Code | 2
PHemoNet: A Multimodal Network for Physiological Signals | Code | 2
Recent Trends of Multimodal Affective Computing: A Survey from NLP Perspective | Code | 2
MEEG and AT-DGNN: Improving EEG Emotion Recognition with Music Introducing and Graph-based Learning | Code | 2
Multimodal Prompt Learning with Missing Modalities for Sentiment Analysis and Emotion Recognition | Code | 2
EmT: A Novel Transformer for Generalized Cross-subject EEG Emotion Recognition | Code | 2
Feature Fusion Based on Mutual-Cross-Attention Mechanism for EEG Emotion Recognition | Code | 2

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | - | Unverified
2 | M2D2 | EmoA | 76.7 | - | Unverified
3 | M2D | EmoA | 76.1 | - | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | - | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | - | Unverified
2 | MultiMAE-DER | WAR | 83.61 | - | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | - | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | - | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | - | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | - | Unverified
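For reference, the "F1-score (Weighted)" metric in the table above is the per-class F1 averaged with each class weighted by its support (the number of true examples of that class). A minimal pure-Python sketch, assuming integer or string class labels (the `weighted_f1` helper is illustrative, not from the listed paper):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """F1 per class, averaged with weights proportional to class support."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)          # true-label counts per class
    total = 0.0
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += support[c] * f1       # weight each class F1 by its support
    return total / len(y_true)
```

In practice a library routine such as scikit-learn's `f1_score(..., average="weighted")` computes the same quantity.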
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | - | Unverified
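The concordance correlation coefficient (CCC) used above measures agreement between predicted and gold continuous emotion ratings, penalizing both scale and mean shifts: CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal NumPy sketch (the `ccc` helper is illustrative, not the listed model's implementation):

```python
import numpy as np

def ccc(y_true, y_pred):
    """Concordance correlation coefficient between two rating sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    # Agreement is 1 for identical sequences, -1 for perfectly reversed ones
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

Unlike Pearson correlation, CCC drops below 1 whenever predictions are biased or mis-scaled, even if they are perfectly linearly correlated with the targets.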
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | - | Unverified