SOTAVerified

Emotion Recognition

Emotion recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
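As a hedged illustration of the speech-based modality mentioned above, the sketch below classifies a signal by comparing two crude acoustic features (frame energy and zero-crossing rate) against per-emotion centroids. The feature set, emotion labels, and synthetic training data are hypothetical stand-ins for illustration only, not any system benchmarked on this page.

```python
import numpy as np

def features(signal, frame=256):
    """Crude speech features: mean frame energy and zero-crossing rate.
    (Hypothetical stand-ins for real acoustic features such as MFCCs.)"""
    frames = signal[: len(signal) // frame * frame].reshape(-1, frame)
    energy = np.mean(frames ** 2)
    zcr = np.mean(np.abs(np.diff(np.sign(signal)))) / 2
    return np.array([energy, zcr])

class NearestCentroid:
    """Minimal nearest-centroid classifier over feature vectors."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array(
            [np.mean([x for x, lab_ in zip(X, y) if lab_ == lab], axis=0)
             for lab in self.labels_])
        return self
    def predict(self, x):
        return self.labels_[int(np.argmin(
            np.linalg.norm(self.centroids_ - x, axis=1)))]

# Synthetic "angry" (loud, noisy) vs. "calm" (quiet, tonal) examples.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
angry = [2.0 * rng.standard_normal(4096) for _ in range(5)]
calm = [0.3 * np.sin(2 * np.pi * 200 * t) for _ in range(5)]
X = [features(s) for s in angry + calm]
y = ["angry"] * 5 + ["calm"] * 5
clf = NearestCentroid().fit(X, y)
print(clf.predict(features(1.8 * rng.standard_normal(4096))))
```

Real systems replace the hand-crafted features with learned representations (CNNs, transformers, self-supervised encoders), as the paper titles below suggest, but the centroid comparison conveys the basic classify-by-similarity idea.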

Papers

Showing 1151–1175 of 2041 papers

| Title | Status | Hype |
|---|---|---|
| SAFER: Situation Aware Facial Emotion Recognition | | 0 |
| Samsung Research China-Beijing at SemEval-2024 Task 3: A multi-stage framework for Emotion-Cause Pair Extraction in Conversations | | 0 |
| Sarcasm in Sight and Sound: Benchmarking and Expansion to Improve Multimodal Sarcasm Detection | | 0 |
| ScalingNet: extracting features from raw EEG data for emotion recognition | | 0 |
| Seamless Multimodal Biometrics for Continuous Personalised Wellbeing Monitoring | | 0 |
| Searching for Effective Preprocessing Method and CNN-based Architecture with Efficient Channel Attention on Speech Emotion Recognition | | 0 |
| Seeking Subjectivity in Visual Emotion Distribution Learning | | 0 |
| Selective Co-occurrences for Word-Emotion Association | | 0 |
| Self context-aware emotion perception on human-robot interaction | | 0 |
| Self-Supervised Attention Networks and Uncertainty Loss Weighting for Multi-Task Emotion Recognition on Vocal Bursts | | 0 |
| Self-supervised Auxiliary Learning for Texture and Model-based Hybrid Robust and Fair Featuring in Face Analysis | | 0 |
| Self-supervised Gait-based Emotion Representation Learning from Selective Strongly Augmented Skeleton Sequences | | 0 |
| Self-Supervised Learning for Audio-Based Emotion Recognition | | 0 |
| Self-Supervised learning with cross-modal transformers for emotion recognition | | 0 |
| Self-supervised representations in speech-based depression detection | | 0 |
| SeLiNet: Sentiment enriched Lightweight Network for Emotion Recognition in Images | | 0 |
| Semantic Role Labeling of Emotions in Tweets | | 0 |
| SemEval 2018 Task 2: Multilingual Emoji Prediction | | 0 |
| SemEval-2020 Task 8: Memotion Analysis - the Visuo-Lingual Metaphor! | | 0 |
| Semi-supervised Bayesian Deep Multi-modal Emotion Recognition | | 0 |
| Semi-supervised cross-lingual speech emotion recognition | | 0 |
| Semi-supervised Deep Generative Modelling of Incomplete Multi-Modality Emotional Data | | 0 |
| Semi-Supervised Dual-Stream Self-Attentive Adversarial Graph Contrastive Learning for Cross-Subject EEG-based Emotion Recognition | | 0 |
| Semi-Supervised Self-Learning Enhanced Music Emotion Recognition | | 0 |
| SensEmo: Enabling Affective Learning through Real-time Emotion Recognition with Smartwatches | | 0 |
Page 47 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
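The concordance correlation coefficient (CCC) reported above measures agreement between predicted and reference continuous emotion ratings, penalising both low correlation and systematic bias. A minimal NumPy implementation of the standard formula, for reference:

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient between two continuous series:
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Perfect agreement gives 1.0; a constant offset lowers the score
# even though Pearson correlation would remain 1.0.
a = np.array([0.1, 0.4, 0.6, 0.9])
print(ccc(a, a))        # 1.0
print(ccc(a, a + 0.5))  # below 1.0 despite perfect correlation
```

Unlike plain Pearson correlation, CCC drops when predictions are shifted or scaled relative to the reference, which is why it is a common metric for dimensional (valence/arousal) emotion prediction.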
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | | 1'"1 | | Unverified |