SOTAVerified

Aligning Brain Activity with Advanced Transformer Models: Exploring the Role of Punctuation in Semantic Processing

2025-01-10

Zenon Lamprou, Frank Pollick, Yashar Moshfeghi


Abstract

This research examines the congruence between neural activity and advanced transformer models, emphasizing the semantic significance of punctuation in text understanding. Utilizing an innovative approach originally proposed by Toneva and Wehbe, we evaluate four advanced transformer models (RoBERTa, DistilBERT, ALBERT, and ELECTRA) against neural activity data. Our findings indicate that RoBERTa exhibits the closest alignment with neural activity, surpassing BERT in accuracy. Furthermore, we investigate the impact of punctuation removal on model performance and neural alignment, revealing that BERT's accuracy improves in the absence of punctuation. This study contributes to the comprehension of how neural networks represent language and the influence of punctuation on semantic processing within the human brain.
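The abstract describes comparing model–brain alignment with and without punctuation, which implies a preprocessing step that strips punctuation from the stimulus text before feeding it to the models. The paper does not publish this step here; the following is a minimal sketch of such preprocessing, assuming plain ASCII punctuation (the function name `strip_punctuation` is ours, not from the paper).

```python
import string

def strip_punctuation(text: str) -> str:
    """Remove ASCII punctuation from a stimulus sentence and collapse
    the doubled whitespace that removal can leave behind."""
    # str.maketrans with a third argument maps every listed character to None
    table = str.maketrans("", "", string.punctuation)
    stripped = text.translate(table)
    # split()/join normalizes runs of spaces into single spaces
    return " ".join(stripped.split())

sentence = "Well, the model - like the brain - may rely on punctuation."
print(strip_punctuation(sentence))
```

Both the original and the punctuation-stripped text would then be tokenized and passed through each transformer to extract the contextual representations that are aligned with the neural recordings.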
