SOTAVerified

State Space Models

Papers

Showing 1–10 of 923 papers

| Title | Status | Hype |
| --- | --- | --- |
| Transformers are SSMs: Generalized Models and Efficient Algorithms Through Structured State Space Duality | Code | 11 |
| ThunderKittens: Simple, Fast, and Adorable AI Kernels | Code | 7 |
| xLSTM: Extended Long Short-Term Memory | Code | 7 |
| Mamba: Linear-Time Sequence Modeling with Selective State Spaces | Code | 6 |
| Awesome Multi-modal Object Tracking | Code | 5 |
| Gated Attention for Large Language Models: Non-linearity, Sparsity, and Attention-Sink-Free | Code | 4 |
| Samba: Simple Hybrid State Space Models for Efficient Unlimited Context Language Modeling | Code | 4 |
| Mamba YOLO: A Simple Baseline for Object Detection with State Space Model | Code | 4 |
| A Survey on Visual Mamba | Code | 4 |
| Megalodon: Efficient LLM Pretraining and Inference with Unlimited Context Length | Code | 4 |
