More Effective LLM Compressed Tokens with Uniformly Spread Position Identifiers and Compression Loss

2024-09-22

Runsong Zhao, Pengcheng Huang, Xinyu Liu, Chunyang Xiao, Tong Xiao, Jingbo Zhu

Abstract

Compressing Transformer inputs into compressed tokens allows LLMs to run with improved speed and cost efficiency. Building on the compression method ICAE, we carefully examine the choice of position identifiers for compressed tokens and also propose a new compression loss. We demonstrate empirically that our proposed methods achieve a significantly higher compression ratio (15x, compared to 4x for ICAE) while attaining comparable reconstruction performance.
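To make the title's idea concrete, the following is a minimal sketch of what "uniformly spread" position identifiers could look like: instead of giving the compressed tokens consecutive positions appended after the input (positions L, L+1, ..., L+k-1), their position ids are spaced evenly across the original sequence span. The function name and exact placement rule here are assumptions for illustration, not the paper's precise scheme.

```python
def uniform_position_ids(seq_len: int, num_compressed: int) -> list[int]:
    """Place num_compressed position ids evenly across [0, seq_len).

    Each compressed token is assigned the (approximate) center of the
    segment of original positions it is meant to summarize.
    """
    stride = seq_len / num_compressed
    return [int(i * stride + stride / 2) for i in range(num_compressed)]

# Example: 60 input tokens compressed into 4 tokens (a 15x ratio)
# yields position ids spread across the whole input span.
print(uniform_position_ids(60, 4))  # [7, 22, 37, 52]
```

Under consecutive (ICAE-style) placement these four tokens would instead sit at positions 60-63, outside the span of the text they summarize; the uniform spread keeps each compressed token's position id near the region it represents.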
