SOTAVerified

Emotion Recognition

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 451–475 of 2041 papers

| Title | Status | Hype |
|---|---|---|
| Robust EEG-based Emotion Recognition Using an Inception and Two-sided Perturbation Model | | 0 |
| MFHCA: Enhancing Speech Emotion Recognition Via Multi-Spatial Fusion and Hierarchical Cooperative Attention | | 0 |
| Multi-channel Emotion Analysis for Consensus Reaching in Group Movie Recommendation Systems | | 0 |
| Authentic Emotion Mapping: Benchmarking Facial Expressions in Real News | Code | 0 |
| Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1 |
| TRNet: Two-level Refinement Network leveraging Speech Enhancement for Noise Robust Speech Emotion Recognition | | 0 |
| Alleviating Catastrophic Forgetting in Facial Expression Recognition with Emotion-Centered Models | | 0 |
| Dynamic Modality and View Selection for Multimodal Emotion Recognition with Missing Modalities | | 0 |
| Context-Aware Siamese Networks for Efficient Emotion Recognition in Conversation | | 0 |
| Deep CNN with late fusion for realtime multimodal emotion recognition | | 0 |
| Joint Contrastive Learning with Feature Alignment for Cross-Corpus EEG-based Emotion Recognition | | 0 |
| Customising General Large Language Models for Specialised Emotion Recognition Tasks | Code | 0 |
| MMA-DFER: MultiModal Adaptation of unimodal models for Dynamic Facial Expression Recognition in-the-wild | Code | 2 |
| Improving Personalisation in Valence and Arousal Prediction using Data Augmentation | | 0 |
| AIMDiT: Modality Augmentation and Interaction via Multimodal Dimension Transformation for Emotion Recognition in Conversations | | 0 |
| The Power of Properties: Uncovering the Influential Factors in Emotion Classification | | 0 |
| Resolve Domain Conflicts for Generalizable Remote Physiological Measurement | Code | 1 |
| Multimodal Emotion Recognition by Fusing Video Semantic in MOOC Learning Scenarios | | 0 |
| What is Learnt by the LEArnable Front-end (LEAF)? Adapting Per-Channel Energy Normalisation (PCEN) to Noisy Conditions | Code | 0 |
| nEMO: Dataset of Emotional Speech in Polish | Code | 0 |
| Improving Facial Landmark Detection Accuracy and Efficiency with Knowledge Distillation | | 0 |
| Dynamic Resolution Guidance for Facial Expression Recognition | | 0 |
| Facial Affective Behavior Analysis with Instruction Tuning | Code | 1 |
| Towards Bi-Hemispheric Emotion Mapping through EEG: A Dual-Stream Neural Network Approach | | 0 |
| Music Recommendation Based on Facial Emotion Recognition | | 0 |
Page 19 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | 1'" | 1 | | Unverified |
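One of the benchmarks above reports the Concordance correlation coefficient (CCC), a standard metric for continuous valence/arousal prediction. As a point of reference, here is a minimal sketch of how CCC is commonly computed (this is an illustrative implementation, not the evaluation code used by any of the listed entries):

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient (Lin's CCC):
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
    Penalizes both low correlation and shifts in mean/scale."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Identical predictions give perfect agreement (CCC = 1);
# a constant offset between predictions and labels lowers the score
# even when the Pearson correlation stays at 1.
print(ccc([0.1, 0.5, 0.9], [0.1, 0.5, 0.9]))
print(ccc([0.1, 0.5, 0.9], [0.3, 0.7, 1.1]))
```

Unlike plain Pearson correlation, CCC drops below 1 whenever predictions are biased or mis-scaled, which is why it is preferred for dimensional emotion benchmarks such as valence/arousal regression.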