SOTAVerified

Emotion Recognition

Emotion recognition is an important research area for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition
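As a rough illustration of the speech-signal route, the sketch below extracts two toy frame statistics (RMS energy and zero-crossing rate, stand-ins for real features such as MFCCs or wav2vec 2.0 embeddings) from synthetic waveforms and classifies them with a nearest-centroid rule. All names and the synthetic "calm"/"angry" data are illustrative assumptions, not taken from any paper listed here.

```python
import numpy as np

def features(signal):
    """Toy acoustic features: RMS energy and zero-crossing rate.
    Real SER systems use richer features (MFCCs, learned embeddings)."""
    rms = np.sqrt(np.mean(signal ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(signal))) > 0)
    return np.array([rms, zcr])

rng = np.random.default_rng(0)
# Synthetic stand-ins: "calm" = quiet smooth tone, "angry" = loud noisy burst
calm = [0.1 * np.sin(2 * np.pi * 5 * np.linspace(0, 1, 800))
        + 0.01 * rng.standard_normal(800) for _ in range(20)]
angry = [0.8 * rng.standard_normal(800) for _ in range(20)]

X = np.array([features(s) for s in calm + angry])
y = np.array([0] * 20 + [1] * 20)  # 0 = calm, 1 = angry

# Nearest-centroid classifier: one mean feature vector per emotion class
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(signal):
    """Assign the class whose centroid is closest in feature space."""
    d = np.linalg.norm(centroids - features(signal), axis=1)
    return int(np.argmin(d))
```

A real system would replace both the features and the classifier, but the shape of the pipeline (signal, feature extraction, classifier) is the same.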

Papers

Showing 76–100 of 2041 papers

| Title | Status | Hype |
| --- | --- | --- |
| SER Evals: In-domain and Out-of-domain Benchmarking for Speech Emotion Recognition | Code | 1 |
| HiQuE: Hierarchical Question Embedding Network for Multimodal Depression Detection | Code | 1 |
| Beyond Silent Letters: Amplifying LLMs in Emotion Recognition with Vocal Nuances | Code | 1 |
| Tracing Intricate Cues in Dialogue: Joint Graph Structure and Sentiment Dynamics for Multimodal Emotion Recognition | Code | 1 |
| Multimodal Emotion Recognition using Audio-Video Transformer Fusion with Cross Attention | Code | 1 |
| Norface: Improving Facial Expression Analysis by Identity Normalization | Code | 1 |
| MDPE: A Multimodal Deception Dataset with Personality and Emotional Characteristics | Code | 1 |
| MMAD: Multi-label Micro-Action Detection in Videos | Code | 1 |
| BiosERC: Integrating Biography Speakers Supported by LLMs for ERC Tasks | Code | 1 |
| Odyssey 2024 - Speech Emotion Recognition Challenge: Dataset, Baseline Framework, and Results | Code | 1 |
| SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations | Code | 1 |
| A Supervised Information Enhanced Multi-Granularity Contrastive Learning Framework for EEG Based Emotion Recognition | Code | 1 |
| MultiMAE-DER: Multimodal Masked Autoencoder for Dynamic Emotion Recognition | Code | 1 |
| EmoVIT: Revolutionizing Emotion Insights with Visual Instruction Tuning | Code | 1 |
| CAGE: Circumplex Affect Guided Expression Inference | Code | 1 |
| Cooperative Sentiment Agents for Multimodal Sentiment Analysis | Code | 1 |
| Resolve Domain Conflicts for Generalizable Remote Physiological Measurement | Code | 1 |
| Facial Affective Behavior Analysis with Instruction Tuning | Code | 1 |
| MIPS at SemEval-2024 Task 3: Multimodal Emotion-Cause Pair Extraction in Conversations with Multimodal Language Models | Code | 1 |
| Emotion-Anchored Contrastive Learning Framework for Emotion Recognition in Conversation | Code | 1 |
| Accuracy enhancement method for speech emotion recognition from spectrogram using temporal frequency correlation and positional information learning through knowledge transfer | Code | 1 |
| emoDARTS: Joint Optimisation of CNN & Sequential Neural Network Architectures for Superior Speech Emotion Recognition | Code | 1 |
| Recursive Joint Cross-Modal Attention for Multimodal Fusion in Dimensional Emotion Recognition | Code | 1 |
| Joint Multimodal Transformer for Emotion Recognition in the Wild | Code | 1 |
| GPT as Psychologist? Preliminary Evaluations for GPT-4V on Visual Affective Computing | Code | 1 |
Page 4 of 82

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | M2D-CLAP | EmoA | 77.4 | | Unverified |
| 2 | M2D2 | EmoA | 76.7 | | Unverified |
| 3 | M2D | EmoA | 76.1 | | Unverified |
| 4 | Jukebox (Pre-training: CALM) | EmoA | 72.1 | | Unverified |
| 5 | CLMR (Pre-training: contrastive) | EmoA | 67.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | LogisticRegression on posteriors of xlsr-Wav2Vec2.0&bi-LSTM+Attention | Accuracy | 86.7 | | Unverified |
| 2 | MultiMAE-DER | WAR | 83.61 | | Unverified |
| 3 | Intermediate-Attention-Fusion | Accuracy | 81.58 | | Unverified |
| 4 | Logistic Regression on posteriors of the CNN-14&biLSTM-GuidedST | Accuracy | 80.08 | | Unverified |
| 5 | ERANN-0-4 | Accuracy | 74.8 | | Unverified |
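WAR (weighted average recall), the metric reported for MultiMAE-DER above, weights each class's recall by its prevalence, which makes it equal to plain accuracy; its companion UAR (unweighted average recall) averages per-class recalls equally and is the more common choice for imbalanced SER datasets. A minimal numpy sketch with made-up labels:

```python
import numpy as np

def war_uar(y_true, y_pred):
    """WAR = per-class recall weighted by class frequency (== accuracy);
    UAR = unweighted mean of per-class recalls."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes = np.unique(y_true)
    recalls = np.array([np.mean(y_pred[y_true == c] == c) for c in classes])
    weights = np.array([np.mean(y_true == c) for c in classes])
    return float(recalls @ weights), float(recalls.mean())

# Made-up 3-class example with class imbalance
y_true = [0, 0, 0, 0, 1, 1, 2, 2, 2, 2]
y_pred = [0, 0, 0, 1, 1, 0, 2, 2, 2, 0]
war, uar = war_uar(y_true, y_pred)  # WAR 0.70, UAR ~0.667
```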
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CAGE | Top-3 Accuracy (%) | 14.73 | | Unverified |
| 2 | FocusCLIP | Top-3 Accuracy (%) | 13.73 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | VGG based | 5-class test accuracy | 66.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | MaSaC-ERC-Z | F1-score (Weighted) | 51.17 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | BiHDM | Accuracy | 40.34 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | w2v2-L-robust-12 | Concordance correlation coefficient (CCC) | 0.64 | | Unverified |
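The concordance correlation coefficient (CCC) used above measures agreement between predicted and gold continuous emotion ratings (e.g. arousal or valence): CCC = 2·cov(x, y) / (var(x) + var(y) + (mean(x) − mean(y))²), i.e. Pearson correlation penalized for mean and variance mismatch. A small numpy sketch with made-up ratings:

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient between two rating series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

gold = np.array([0.1, 0.4, 0.35, 0.8, 0.6])   # made-up arousal annotations
pred = np.array([0.2, 0.35, 0.3, 0.7, 0.65])  # made-up model outputs
```

CCC is 1 only when the series agree exactly; a model that tracks the trend but is systematically offset or compressed scores lower even if its Pearson correlation is high.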
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 4D-aNN | Accuracy | 96.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | 1'" | 1 | | Unverified |