SOTAVerified

Hard Attention

Papers

Showing 1–50 of 100 papers

| Title | Status | Hype |
| --- | --- | --- |
| AMR Parsing with Action-Pointer Transformer | Code | 1 |
| Coherent Concept-based Explanations in Medical Image and Its Application to Skin Lesion Diagnosis | Code | 1 |
| Table Retrieval May Not Necessitate Table-specific Model Design | Code | 1 |
| Exact Hard Monotonic Attention for Character-Level Transduction | Code | 1 |
| Learning Texture Transformer Network for Image Super-Resolution | Code | 1 |
| A Hybrid Attention Mechanism for Weakly-Supervised Temporal Action Localization | Code | 1 |
| FANet: A Feedback Attention Network for Improved Biomedical Image Segmentation | Code | 1 |
| Recurrent Models of Visual Attention | Code | 1 |
| Investigation of Architectures and Receptive Fields for Appearance-based Gaze Estimation | Code | 1 |
| Mutual Distillation Learning For Person Re-Identification | Code | 1 |
| Hard Non-Monotonic Attention for Character-Level Transduction | Code | 1 |
| Hard-Attention Gates with Gradient Routing for Endoscopic Image Computing | Code | 1 |
| Self-Attention Networks Can Process Bounded Hierarchical Languages | Code | 1 |
| Hard-Attention for Scalable Image Classification | Code | 1 |
| Deep Pneumonia: Attention-Based Contrastive Learning for Class-Imbalanced Pneumonia Lesion Recognition in Chest X-rays | | 0 |
| Effect of choice of probability distribution, randomness, and search methods for alignment modeling in sequence-to-sequence text-to-speech synthesis using hard alignment | | 0 |
| Ehrenfeucht-Haussler Rank and Chain of Thought | | 0 |
| Emergence of Fixational and Saccadic Movements in a Multi-Level Recurrent Attention Model for Vision | | 0 |
| Exact Expressive Power of Transformers with Padding | | 0 |
| Language-Guided Reinforcement Learning for Hard Attention in Few-Shot Learning | | 0 |
| Extractive Adversarial Networks: High-Recall Explanations for Identifying Personal Attacks in Social Media Posts | | 0 |
| Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity | | 0 |
| Generative Adversarial Networks Based on Collaborative Learning and Attention Mechanism for Hyperspectral Image Classification | | 0 |
| GQHAN: A Grover-inspired Quantum Hard Attention Network | | 0 |
| Graph Decoupling Attention Markov Networks for Semi-supervised Graph Node Classification | | 0 |
| Hard Attention Control By Mutual Information Maximization | | 0 |
| You Only Need One Model for Open-domain Question Answering | | 0 |
| NoPE: The Counting Power of Transformers with No Positional Encodings | | 0 |
| Achieving Explainability in a Visual Hard Attention Model through Content Prediction | | 0 |
| A Differentiable Self-disambiguated Sense Embedding Model via Scaled Gumbel Softmax | | 0 |
| AMR Parsing with Action-Pointer Transformer | | 0 |
| An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing | | 0 |
| A study of latent monotonic attention variants | | 0 |
| AttentionDrop: A Novel Regularization Method for Transformer Models | | 0 |
| Average-Hard Attention Transformers are Constant-Depth Uniform Threshold Circuits | | 0 |
| Characterizing the Expressivity of Transformer Language Models | | 0 |
| CLAWS: Contrastive Learning with hard Attention and Weak Supervision | | 0 |
| Comparison of different Unique hard attention transformer models by the formal languages they can recognize | | 0 |
| Continual Diffusion with STAMINA: STack-And-Mask INcremental Adapters | | 0 |
| DanHAR: Dual Attention Network For Multimodal Human Activity Recognition Using Wearable Sensors | | 0 |
| Look Harder: A Neural Machine Translation Model with Hard Attention | | 0 |
| Lower Bounds for Chain-of-Thought Reasoning in Hard-Attention Transformers | | 0 |
| Masked Hard-Attention Transformers Recognize Exactly the Star-Free Languages | | 0 |
| MESAHA-Net: Multi-Encoders based Self-Adaptive Hard Attention Network with Maximum Intensity Projections for Lung Nodule Segmentation in CT Scan | | 0 |
| Dual Attention Model with Reinforcement Learning for Classification of Histology Whole-Slide Images | | 0 |
| Multimodal Emergent Fake News Detection via Meta Neural Process Networks | | 0 |
| MultiResolution Attention Extractor for Small Object Detection | | 0 |
| Multi-View Unsupervised Image Generation with Cross Attention Guidance | | 0 |
| Near-Optimal Glimpse Sequences for Improved Hard Attention Neural Network Training | | 0 |
| Near-Optimal Glimpse Sequences for Training Hard Attention Neural Networks | | 0 |

No leaderboard results yet.