SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 1976–2000 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| Emotion Recognition in Low-Resource Settings: An Evaluation of Automatic Feature Selection Methods | | 0 |
| Emotion Recognition In Persian Speech Using Deep Neural Networks | | 0 |
| Emotion Recognition in Speech using Cross-Modal Transfer in the Wild | | 0 |
| Emotion Recognition System from Speech and Visual Information based on Convolutional Neural Networks | | 0 |
| Emotion recognition techniques with rule based and machine learning approaches | | 0 |
| Emotion Recognition under Consideration of the Emotion Component Process Model | | 0 |
| Emotion Recognition Using Convolutional Neural Network with Selected Statistical Photoplethysmogram Features | | 0 |
| Emotion Recognition Using Convolutional Neural Networks | | 0 |
| Emotion Recognition Using Fusion of Audio and Video Features | | 0 |
| Emotion Recognition using Machine Learning and ECG signals | | 0 |
| Emotion Recognition Using Speaker Cues | | 0 |
| Emotion Recognition Using Wearables: A Systematic Literature Review | Work in progress | 0 |
| Emotion recognition with 4k resolution database | | 0 |
| Emotion Recognition with CLIP and Sequential Learning | | 0 |
| Emotion Recognition with Facial Attention and Objective Activation Functions | | 0 |
| Emotion Recognition with Incomplete Labels Using Modified Multi-task Learning Technique | | 0 |
| Emotion Recognition with Machine Learning Using EEG Signals | | 0 |
| Emotion Recognition with Pre-Trained Transformers Using Multimodal Signals | | 0 |
| Emotion Recognition with Spatial Attention and Temporal Softmax Pooling | | 0 |
| Emotion Recognition With Temporarily Localized 'Emotional Events' in Naturalistic Context | | 0 |
| Emotions in the Loop: A Survey of Affective Computing for Emotional Support | | 0 |
| Emotion Stimulus Detection in German News Headlines | | 0 |
| EmotionX-AR: CNN-DCNN autoencoder based Emotion Classifier | | 0 |
| EmotionX-Area66: Predicting Emotions in Dialogues using Hierarchical Attention Network with Sequence Labeling | | 0 |
| Emotiphons: Emotion Markers in Conversational Speech - Comparison across Indian Languages | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | | 1'"1 | | Unverified |
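One of the leaderboards above reports the Concordance Correlation Coefficient (CCC), which is common for continuous (valence/arousal) emotion prediction. As a reference, here is a minimal NumPy sketch of Lin's CCC using population statistics; the function name `concordance_ccc` is illustrative and not part of any library or of this site:

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient (population statistics, ddof=0)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    # CCC = 2*cov / (var_t + var_p + (mean_t - mean_p)^2)
    return 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

# Perfect agreement gives CCC = 1.0; a constant offset lowers it.
print(concordance_ccc([0.1, 0.5, 0.9], [0.1, 0.5, 0.9]))  # → 1.0
```

Unlike plain Pearson correlation, CCC penalizes both scale and location shifts between predictions and targets, which is why it is preferred for dimensional emotion benchmarks.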