Multi-Source EEG Emotion Recognition via Dynamic Contrastive Domain Adaptation
Yun Xiao, Yimeng Zhang, Xiaopeng Peng, Shuzheng Han, Xia Zheng, Dingyi Fang, Xiaojiang Chen
Abstract
Electroencephalography (EEG) provides reliable indications of human cognition and mental states. Accurate emotion recognition from EEG remains challenging, however, due to signal variations among individuals and across measurement sessions. We introduce a multi-source dynamic contrastive domain adaptation method (MS-DCDA) based on differential entropy (DE) features, in which coarse-grained inter-domain and fine-grained intra-class adaptations are modeled through a multi-branch contrastive neural network and contrastive sub-domain discrepancy learning. Leveraging domain knowledge from each individual source and a complementary source ensemble, our model uses dynamically weighted learning to achieve an optimal tradeoff between domain transferability and discriminability. The proposed MS-DCDA model was evaluated on the SEED and SEED-IV datasets, achieving the highest mean accuracies of 90.84% and 78.49% in cross-subject experiments and of 95.82% and 82.25% in cross-session experiments, respectively. Our model outperforms several alternative domain adaptation methods in recognition accuracy, inter-class margin, and intra-class compactness. Our study also suggests greater emotional sensitivity in the frontal and parietal brain lobes, offering insights for mental health interventions, personalized medicine, and preventive strategies.
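The abstract's differential entropy (DE) features are a standard EEG representation: under a Gaussian assumption, the DE of a band-limited signal segment reduces to 0.5·log(2πeσ²). The sketch below is not the paper's implementation, but a minimal illustration of how band-wise DE features might be extracted from a single EEG channel; the band definitions and the crude FFT-mask filter are our own assumptions, and the signal is synthetic noise standing in for real SEED recordings.

```python
import numpy as np

def differential_entropy(segment):
    # DE of a segment under a Gaussian assumption: 0.5 * log(2*pi*e*sigma^2)
    var = np.var(segment)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def bandpass_fft(signal, fs, low, high):
    # Crude FFT-mask band-pass filter (illustrative only, not the paper's method)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.fft.rfft(signal)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Five frequency bands commonly used in SEED-based studies (an assumption here)
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}
fs = 200                                  # Hz, SEED's published sampling rate
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 4)         # 4 s of synthetic single-channel "EEG"

# One DE feature per band; real pipelines repeat this per channel and per window
features = {name: differential_entropy(bandpass_fft(eeg, fs, lo, hi))
            for name, (lo, hi) in bands.items()}
```

Stacking these per-band, per-channel DE values over short windows yields the feature matrix that a domain-adaptation network such as MS-DCDA would take as input.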