SOTA (Verified)

Book summarization

Papers

Showing 1–10 of 13 papers

Title | Status | Hype
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention | Code | 4
Unlimiformer: Long-Range Transformers with Unlimited Length Input | Code | 3
KV Cache Compression, But What Must We Give in Return? A Comprehensive Benchmark of Long Context Capable Approaches | Code | 2
Attention Score is not All You Need for Token Importance Indicator in KV Cache Reduction: Value Also Matters | Code | 1
Cache Me If You Can: How Many KVs Do You Need for Effective Long-Context LMs? | Code | 1
LOCOST: State-Space Models for Long Document Abstractive Summarization | Code | 1
Enhancing Large Language Model with Self-Controlled Memory Framework | Code | 1
Echoes from Alexandria: A Large Resource for Multilingual Book Summarization | Code | 1
New Alignment Methods for Discriminative Book Summarization | - | 0
Is It Really Long Context if All You Need Is Retrieval? Towards Genuinely Difficult Long Context NLP | - | 0

No leaderboard results yet.