
Position Interpolation Improves ALiBi Extrapolation

2023-10-18

Faisal Al-Khateeb, Nolan Dey, Daria Soboleva, Joel Hestness

Abstract

Linear position interpolation helps pre-trained models that use rotary position embeddings (RoPE) extrapolate to longer sequence lengths. We propose using linear position interpolation to extend the extrapolation range of models using Attention with Linear Biases (ALiBi). We find that position interpolation significantly improves extrapolation capability on upstream language modelling and on downstream summarization and retrieval tasks.
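The idea can be sketched as follows. ALiBi adds a per-head linear bias proportional to the query–key distance; linear position interpolation rescales those distances at inference so that a longer evaluation sequence maps back into the distance range seen during training. The function names and the simplified causal-bias construction below are illustrative assumptions, not the authors' implementation.

```python
import math

def alibi_slopes(n_heads):
    # ALiBi head slopes: geometric sequence 2^(-8/n), ..., 2^(-8)
    # (closed form valid when n_heads is a power of two).
    return [2.0 ** (-8.0 * (i + 1) / n_heads) for i in range(n_heads)]

def alibi_bias(seq_len, n_heads, train_len=None):
    # Causal additive attention biases: bias[h][q][k] = -slope_h * (q - k) * scale.
    # With position interpolation, relative distances are scaled by
    # train_len / seq_len whenever seq_len exceeds train_len, so the
    # biases stay within the range the model saw during training.
    if train_len is None or seq_len <= train_len:
        scale = 1.0
    else:
        scale = train_len / seq_len
    slopes = alibi_slopes(n_heads)
    return [
        [[-s * (q - k) * scale for k in range(q + 1)] for q in range(seq_len)]
        for s in slopes
    ]
```

For example, evaluating at length 8 with a training length of 4 halves every relative distance, so the largest bias magnitude matches what a length-4 context would produce without interpolation.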
