SOTAVerified

Emotion Recognition

Emotion Recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Papers

Showing 201–250 of 2041 papers

Title | Status | Hype
Enhancing Speech Emotion Recognition Through Differentiable Architecture Search | Code | 1
Arabic Speech Emotion Recognition Employing Wav2vec2.0 and HuBERT Based on BAVED Dataset | Code | 1
Multimodal Routing: Improving Local and Global Interpretability of Multimodal Language Analysis | Code | 1
Interpretable SincNet-based Deep Learning for Emotion Recognition from EEG brain activity | Code | 1
Is Cross-Attention Preferable to Self-Attention for Multi-Modal Emotion Recognition? | Code | 1
Jointly Fine-Tuning “BERT-like” Self Supervised Models to Improve Multimodal Speech Emotion Recognition | Code | 1
CoMPM: Context Modeling with Speaker's Pre-trained Memory Tracking for Emotion Recognition in Conversation | Code | 1
Latent Distribution Decoupling: A Probabilistic Framework for Uncertainty-Aware Multimodal Emotion Recognition | Code | 1
Context Based Emotion Recognition using EMOTIC Dataset | Code | 1
Contrast and Generation Make BART a Good Dialogue Emotion Recognizer | Code | 1
Learning Emotion Representations from Verbal and Nonverbal Communication | Code | 1
A Joint Cross-Attention Model for Audio-Visual Fusion in Dimensional Emotion Recognition | Code | 1
Leveraging Label Correlations in a Multi-label Setting: A Case Study in Emotion | Code | 1
Automated Parkinson's Disease Detection and Affective Analysis from Emotional EEG Signals | Code | 1
LSSED: a large-scale dataset and benchmark for speech emotion recognition | Code | 1
M3ED: Multi-modal Multi-scene Multi-label Emotional Dialogue Database | Code | 1
Mamba-Enhanced Text-Audio-Video Alignment Network for Emotion Recognition in Conversations | Code | 1
Cross-Lingual Cross-Age Group Adaptation for Low-Resource Elderly Speech Emotion Recognition | Code | 1
CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition | Code | 1
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | Code | 1
Milmer: a Framework for Multiple Instance Learning based Multimodal Emotion Recognition | Code | 1
Codified audio language modeling learns useful representations for music information retrieval | Code | 1
MM-DFN: Multimodal Dynamic Fusion Network for Emotion Recognition in Conversations | Code | 1
MMGCN: Multimodal Fusion via Deep Graph Convolution Network for Emotion Recognition in Conversation | Code | 1
Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition | Code | 1
MSAF: Multimodal Split Attention Fusion | Code | 1
A Supervised Information Enhanced Multi-Granularity Contrastive Learning Framework for EEG Based Emotion Recognition | Code | 1
MS-MDA: Multisource Marginal Distribution Adaptation for Cross-subject and Cross-session EEG Emotion Recognition | Code | 1
ChatGPT: Jack of all trades, master of none | Code | 1
Beyond Silent Letters: Amplifying LLMs in Emotion Recognition with Vocal Nuances | Code | 1
MultiMAE-DER: Multimodal Masked Autoencoder for Dynamic Emotion Recognition | Code | 1
Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion | Code | 1
Multimodal Emotion Recognition with High-level Speech and Text Features | Code | 1
Multimodal Information Bottleneck: Learning Minimal Sufficient Unimodal and Multimodal Representations | Code | 1
CFN-ESA: A Cross-Modal Fusion Network with Emotion-Shift Awareness for Dialogue Emotion Recognition | Code | 1
Multivariate, Multi-Frequency and Multimodal: Rethinking Graph Neural Networks for Emotion Recognition in Conversation | Code | 1
MuSe 2020 -- The First International Multimodal Sentiment Analysis in Real-life Media Challenge and Workshop | Code | 1
CLARA: Multilingual Contrastive Learning for Audio Representation Acquisition | Code | 1
COGMEN: COntextualized GNN based Multimodal Emotion recognitioN | Code | 1
PARSE: Pairwise Alignment of Representations in Semi-Supervised EEG Learning for Emotion Recognition | Code | 1
A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis | Code | 1
CAGE: Circumplex Affect Guided Expression Inference | Code | 1
Personalized Dynamic Music Emotion Recognition with Dual-Scale Attention-Based Meta-Learning | Code | 1
Perspective-taking and Pragmatics for Generating Empathetic Responses Focused on Emotion Causes | Code | 1
BHAAV - A Text Corpus for Emotion Analysis from Hindi Stories | Code | 1
Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1
Pre-trained Deep Convolution Neural Network Model With Attention for Speech Emotion Recognition | Code | 1
A Multimodal Corpus for Emotion Recognition in Sarcasm | Code | 1
BiosERC: Integrating Biography Speakers Supported by LLMs for ERC Tasks | Code | 1
Page 5 of 41

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | M2D-CLAP | EmoA | 77.4 | | Unverified
2 | M2D2 | EmoA | 76.7 | | Unverified
3 | M2D | EmoA | 76.1 | | Unverified
4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified
5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Logistic Regression on posteriors of xlsr-Wav2Vec2.0 & bi-LSTM+Attention | Accuracy | 86.7 | | Unverified
2 | MultiMAE-DER | WAR | 83.61 | | Unverified
3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified
4 | Logistic Regression on posteriors of the CNN-14 & biLSTM-GuidedST | Accuracy | 80.08 | | Unverified
5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified
2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | VGG based | 5-class test accuracy | 66.13 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified
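The "F1-score (Weighted)" metric above averages per-class F1 scores weighted by each class's support in the reference labels, which matters for the imbalanced class distributions typical of emotion datasets. A minimal plain-Python sketch (the helper name `weighted_f1` is ours; it follows the same convention as scikit-learn's `average='weighted'`):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged by class frequency (support) in y_true."""
    classes = set(y_true) | set(y_pred)
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += (support[c] / total) * f1  # weight by class support
    return score
```

Classes that appear only in the predictions contribute zero weight, since their support in the reference is zero.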
# | Model | Metric | Claimed | Verified | Status
1 | BiHDM | Accuracy | 40.34 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified
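The CCC metric above (Lin's concordance correlation coefficient, common in dimensional emotion recognition) penalizes predictions that correlate with the reference but differ in scale or offset. A minimal NumPy sketch (the function name `concordance_ccc` is ours):

```python
import numpy as np

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient between two 1-D sequences:
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()  # population variance (ddof=0)
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
```

A constant offset between prediction and reference lowers CCC even when the Pearson correlation is 1, which is why it is preferred over plain correlation for continuous arousal/valence targets.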
# | Model | Metric | Claimed | Verified | Status
1 | 4D-aNN | Accuracy | 96.1 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | | 1'"1 | | Unverified