| Title | Date | Tags | Code | Count |
|---|---|---|---|---|
| Comparison of Different Unique Hard Attention Transformer Models by the Formal Languages They Can Recognize | Jun 3, 2025 | Hard Attention, Survey | Unverified | 0 |
| Characterizing the Expressivity of Transformer Language Models | May 29, 2025 | Hard Attention | Unverified | 0 |
| Exact Expressive Power of Transformers with Padding | May 25, 2025 | Hard Attention | Unverified | 0 |
| Emergence of Fixational and Saccadic Movements in a Multi-Level Recurrent Attention Model for Vision | May 19, 2025 | Hard Attention, Image Classification | Unverified | 0 |
| NoPE: The Counting Power of Transformers with No Positional Encodings | May 16, 2025 | Hard Attention | Unverified | 0 |
| Neuroevolution of Self-Attention Over Proto-Objects | Apr 30, 2025 | Hard Attention, Image Segmentation | Unverified | 0 |
| AttentionDrop: A Novel Regularization Method for Transformer Models | Apr 16, 2025 | Hard Attention | Unverified | 0 |
| Center-guided Classifier for Semantic Segmentation of Remote Sensing Images | Mar 21, 2025 | Hard Attention, Segmentation | Code Available | 0 |
| Unique Hard Attention: A Tale of Two Sides | Mar 18, 2025 | Hard Attention | Unverified | 0 |
| Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers | Feb 4, 2025 | Hard Attention | Unverified | 0 |