SOTAVerified

Long-range modeling

A new task for testing the long-sequence modeling capabilities and efficiency of language models.

Image credit: SCROLLS: Standardized CompaRison Over Long Language Sequences

Papers

Showing 31–40 of 95 papers

Title | Status | Hype
ViTEraser: Harnessing the Power of Vision Transformers for Scene Text Removal with SegMIM Pretraining | Code | 1
Sparse Modular Activation for Efficient Sequence Modeling | Code | 1
The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks | Code | 1
Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation | Code | 1
Fourier Transformer: Fast Long Range Modeling by Removing Sequence Redundancy with FFT Operator | Code | 1
T-former: An Efficient Transformer for Image Inpainting | Code | 1
What Makes Convolutional Models Great on Long Sequence Modeling? | Code | 1
CAB: Comprehensive Attention Benchmarking on Long Sequence Modeling | Code | 1
Multi-scale Attention Network for Single Image Super-Resolution | Code | 1
Adapting Pretrained Text-to-Text Models for Long Text Sequences | Code | 1
Page 4 of 10

No leaderboard results yet.