SOTAVerified

Mamba

Papers

Showing 1–10 of 1058 papers

Title | Status | Hype
Transformers are SSMs: Generalized Models and Efficient Algorithms Through Structured State Space Duality | Code | 11
MambaOut: Do We Really Need Mamba for Vision? | Code | 7
xLSTM 7B: A Recurrent LLM for Fast and Efficient Inference | Code | 7
VMamba: Visual State Space Model | Code | 7
MambaVision: A Hybrid Mamba-Transformer Vision Backbone | Code | 7
Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models | Code | 7
Mamba: Linear-Time Sequence Modeling with Selective State Spaces | Code | 6
MambaIRv2: Attentive State Space Restoration | Code | 5
MambaIR: A Simple Baseline for Image Restoration with State-Space Model | Code | 5
Jamba-1.5: Hybrid Transformer-Mamba Models at Scale | Code | 5
Page 1 of 106

Leaderboard

No leaderboard results yet.