SOTAVerified

Long-range modeling

A new task for testing the long-sequence modeling capabilities and efficiency of language models.

Image credit: SCROLLS: Standardized CompaRison Over Long Language Sequences

Papers

Showing 51–60 of 95 papers

Title | Status | Hype
Sparse Modular Activation for Efficient Sequence Modeling | Code | 1
The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks | Code | 1
Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation | Code | 1
Fourier Transformer: Fast Long Range Modeling by Removing Sequence Redundancy with FFT Operator | Code | 1
Focus Your Attention (with Adaptive IIR Filters) | — | 0
T-former: An Efficient Transformer for Image Inpainting | Code | 1
A General-Purpose Multilingual Document Encoder | Code | 0
RFR-WWANet: Weighted Window Attention-Based Recovery Feature Resolution Network for Unsupervised Image Registration | Code | 0
HST-MRF: Heterogeneous Swin Transformer with Multi-Receptive Field for Medical Image Segmentation | — | 0
CoLT5: Faster Long-Range Transformers with Conditional Computation | — | 0
Page 6 of 10

No leaderboard results yet.