SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 426–450 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| Sensing technologies and machine learning methods for emotion recognition in autism: Systematic review | | 0 |
| Contextual Emotion Recognition using Large Vision Language Models | | 0 |
| A Supervised Information Enhanced Multi-Granularity Contrastive Learning Framework for EEG Based Emotion Recognition | Code | 1 |
| Self-supervised Gait-based Emotion Representation Learning from Selective Strongly Augmented Skeleton Sequences | | 0 |
| Empathy Through Multimodality in Conversational Interfaces | | 0 |
| Speaker Characterization by means of Attention Pooling | | 0 |
| Adapting WavLM for Speech Emotion Recognition | | 0 |
| Fine-grained Speech Sentiment Analysis in Chinese Psychological Support Hotlines Based on Large-scale Pre-trained Model | Code | 0 |
| ESIHGNN: Event-State Interactions Infused Heterogeneous Graph Neural Network for Conversational Emotion Recognition | | 0 |
| GMP-TL: Gender-augmented Multi-scale Pseudo-label Enhanced Transfer Learning for Speech Emotion Recognition | | 0 |
| Toward end-to-end interpretable convolutional neural networks for waveform signals | | 0 |
| Converting Anyone's Voice: End-to-End Expressive Voice Conversion with a Conditional Diffusion Model | | 0 |
| EALD-MLLM: Emotion Analysis in Long-sequential and De-identity videos with Multi-modal Large Language Model | | 0 |
| Active Learning with Task Adaptation Pre-training for Speech Emotion Recognition | Code | 0 |
| A Systematic Evaluation of Adversarial Attacks against Speech Emotion Recognition Models | Code | 0 |
| MultiMAE-DER: Multimodal Masked Autoencoder for Dynamic Emotion Recognition | Code | 1 |
| Revisiting Multimodal Emotion Recognition in Conversation from the Perspective of Graph Spectrum | | 0 |
| Usefulness of Emotional Prosody in Neural Machine Translation | | 0 |
| Revisiting Multi-modal Emotion Learning with Broad State Space Models and Probability-guidance Fusion | | 0 |
| Two in One Go: Single-stage Emotion Recognition with Decoupled Subject-context Transformer | | 0 |
| MER 2024: Semi-Supervised Learning, Noise Robustness, and Open-Vocabulary Multimodal Emotion Recognition | Code | 3 |
| Samsung Research China-Beijing at SemEval-2024 Task 3: A multi-stage framework for Emotion-Cause Pair Extraction in Conversations | | 0 |
| EmoVIT: Revolutionizing Emotion Insights with Visual Instruction Tuning | Code | 1 |
| M3D: Manifold-based Domain Adaptation with Dynamic Distribution for Non-Deep Transfer Learning in Cross-subject and Cross-session EEG-based Emotion Recognition | | 0 |
| CAGE: Circumplex Affect Guided Expression Inference | Code | 1 |
Page 18 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
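The table above mixes "Accuracy" and "WAR" (weighted average recall), a metric common in speech emotion recognition. WAR weights each class's recall by its frequency in the test set, which makes it equal to overall accuracy; a minimal sketch (the function name `weighted_average_recall` is illustrative, not from any paper listed here):

```python
import numpy as np

def weighted_average_recall(y_true, y_pred):
    """WAR: per-class recall weighted by class frequency.

    Because each class's recall is weighted by how often the class
    occurs, the sum reduces to plain overall accuracy.
    """
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    recalls, weights = [], []
    for c in np.unique(y_true):
        mask = y_true == c
        recalls.append(np.mean(y_pred[mask] == c))  # recall for class c
        weights.append(np.mean(mask))               # frequency of class c
    return float(np.dot(recalls, weights))
```

Its counterpart, unweighted average recall (UAR), averages the per-class recalls without frequency weights and is preferred on imbalanced emotion datasets.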
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
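Unlike the accuracy-style metrics above, CCC scores continuous emotion predictions (e.g. arousal/valence regression). Lin's concordance correlation coefficient penalizes both low correlation and mean/scale disagreement; a minimal NumPy sketch (the function name `concordance_ccc` is illustrative):

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient:

        CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)

    Equals 1 only for perfect agreement; a constant offset or scale
    mismatch lowers it even when Pearson correlation is 1.
    """
    x = np.asarray(y_true, dtype=float)
    y = np.asarray(y_pred, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # population covariance
    return float(2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2))
```

For example, predictions shifted by a constant from the targets keep Pearson r = 1 but score strictly below 1 on CCC, which is why CCC is the standard metric for dimensional emotion benchmarks.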
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |