SOTAVerified

Emotion Recognition

Emotion Recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
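As an illustrative sketch only (not the method of any paper listed below), an emotion recognition pipeline can be framed as feature extraction followed by classification. Here, synthetic feature vectors stand in for real acoustic or visual features (e.g., MFCCs or wav2vec 2.0 embeddings), and a simple nearest-centroid rule stands in for the neural classifiers on the leaderboard; the emotion labels and dimensions are arbitrary choices for the demo.

```python
import numpy as np

# Illustrative only: synthetic 16-dim "feature vectors" for three emotions.
# In a real system these would come from speech, facial, or EEG features.
rng = np.random.default_rng(0)
EMOTIONS = ["angry", "happy", "sad"]

def make_samples(center, n=50, dim=16):
    """Draw n noisy feature vectors around a class-specific center."""
    return center + 0.3 * rng.standard_normal((n, dim))

centers = {e: rng.standard_normal(16) for e in EMOTIONS}
train = {e: make_samples(c) for e, c in centers.items()}

# "Training": store the mean feature vector (centroid) per emotion.
centroids = {e: x.mean(axis=0) for e, x in train.items()}

def predict(x):
    """Classify by nearest centroid in feature space."""
    return min(EMOTIONS, key=lambda e: np.linalg.norm(x - centroids[e]))

# Held-out samples drawn around each center should map back to their class.
test_acc = np.mean([
    predict(v) == e
    for e in EMOTIONS
    for v in make_samples(centers[e], n=20)
])
print(f"accuracy: {test_acc:.2f}")
```

On well-separated synthetic clusters this toy rule scores near-perfectly; the benchmark tables below report what comparable (but learned) models claim on real datasets.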

Papers

Showing 551-575 of 2041 papers

| Title | Status | Hype |
|-------|--------|------|
| Emotion recognition in talking-face videos using persistent entropy and neural networks | Code | 0 |
| Emotion Recognition From Speech With Recurrent Neural Networks | Code | 0 |
| Multi-Teacher Language-Aware Knowledge Distillation for Multilingual Speech Emotion Recognition | Code | 0 |
| Emotion Recognition in the Wild using Deep Neural Networks and Bayesian Classifiers | Code | 0 |
| EmoTxt: A Toolkit for Emotion Recognition from Text | Code | 0 |
| Enhancing Affective Representations of Music-Induced EEG through Multimodal Supervision and Latent Domain Adaptation | Code | 0 |
| EmotionIC: Emotional Inertia and Contagion-Driven Dependency Modeling for Emotion Recognition in Conversation | Code | 0 |
| Dialogue Quality and Emotion Annotations for Customer Support Conversations | Code | 0 |
| Emotion Detection From Tweets Using a BERT and SVM Ensemble Model | Code | 0 |
| Emotion Analysis of Writers and Readers of Japanese Tweets on Vaccinations | Code | 0 |
| DepecheMood++: a Bilingual Emotion Lexicon Built Through Simple Yet Powerful Techniques | Code | 0 |
| Emotional Speech Recognition with Pre-trained Deep Visual Models | Code | 0 |
| M-MELD: A Multilingual Multi-Party Dataset for Emotion Recognition in Conversations | Code | 0 |
| Emotion Analysis in NLP: Trends, Gaps and Roadmap for Future Directions | Code | 0 |
| Evaluating Gammatone Frequency Cepstral Coefficients with Neural Networks for Emotion Recognition from Speech | Code | 0 |
| Multi-Task Learning Framework for Emotion Recognition in-the-wild | Code | 0 |
| Deformable Convolutional LSTM for Human Body Emotion Recognition | Code | 0 |
| Complementary Fusion of Multi-Features and Multi-Modalities in Sentiment Analysis | Code | 0 |
| A Compact Embedding for Facial Expression Similarity | Code | 0 |
| Distilled Non-Semantic Speech Embeddings with Binary Neural Networks for Low-Resource Devices | Code | 0 |
| EmojiHeroVR: A Study on Facial Expression Recognition under Partial Occlusion from Head-Mounted Displays | Code | 0 |
| Audio-Linguistic Embeddings for Spoken Sentences | Code | 0 |
| Emotion Action Detection and Emotion Inference: the Task and Dataset | Code | 0 |
| DiTMoS: Delving into Diverse Tiny-Model Selection on Microcontrollers | Code | 0 |
| Emotion Recognition from Speech | Code | 0 |
Page 23 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | CNN | | 1'"1 | | Unverified |