SOTAVerified

Hard Attention

Papers

Showing 51–75 of 100 papers

Title | Status | Hype
Specialized Transformers: Faster, Smaller and more Accurate NLP Models | | 0
Text as Environment: A Deep Reinforcement Learning Text Readability Assessment Model | | 0
Theoretical Limitations of Self-Attention in Neural Sequence Models | | 0
Transformers as Transducers | | 0
Transformers in Uniform TC^0 | | 0
Unique Hard Attention: A Tale of Two Sides | | 0
Upper, Middle and Lower Region Learning for Facial Action Unit Detection | | 0
Video Violence Recognition and Localization Using a Semi-Supervised Hard Attention Model | | 0
Word Representation Models for Morphologically Rich Languages in Neural Machine Translation | | 0
You Only Need One Model for Open-domain Question Answering | | 0
NoPE: The Counting Power of Transformers with No Positional Encodings | | 0
Achieving Explainability in a Visual Hard Attention Model through Content Prediction | | 0
A Differentiable Self-disambiguated Sense Embedding Model via Scaled Gumbel Softmax | | 0
AMR Parsing with Action-Pointer Transformer | | 0
An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing | | 0
A study of latent monotonic attention variants | | 0
AttentionDrop: A Novel Regularization Method for Transformer Models | | 0
Average-Hard Attention Transformers are Constant-Depth Uniform Threshold Circuits | | 0
Characterizing the Expressivity of Transformer Language Models | | 0
CLAWS: Contrastive Learning with hard Attention and Weak Supervision | | 0
Comparison of different Unique hard attention transformer models by the formal languages they can recognize | | 0
Continual Diffusion with STAMINA: STack-And-Mask INcremental Adapters | | 0
DanHAR: Dual Attention Network For Multimodal Human Activity Recognition Using Wearable Sensors | | 0
Deep Pneumonia: Attention-Based Contrastive Learning for Class-Imbalanced Pneumonia Lesion Recognition in Chest X-rays | | 0
Effect of choice of probability distribution, randomness, and search methods for alignment modeling in sequence-to-sequence text-to-speech synthesis using hard alignment | | 0
Page 3 of 4

No leaderboard results yet.