
Solving Attention Kernel Regression Problem via Pre-conditioner

2023-08-28

Zhao Song, Junze Yin, Lichen Zhang


Abstract

The attention mechanism is the key to large language models, and the attention matrix serves as an algorithmic and computational bottleneck for such a scheme. In this paper, we define two problems, motivated by designing fast algorithms for proxies of the attention matrix and solving regressions against them. Given an input matrix $A \in \mathbb{R}^{n \times d}$ with $n \gg d$ and a response vector $b$, we first consider the matrix exponential of $A^\top A$ as a proxy, and we in turn design algorithms for two types of regression problems: $\min_{x \in \mathbb{R}^d} \|(A^\top A)^j x - b\|_2$ and $\min_{x \in \mathbb{R}^d} \|A(A^\top A)^j x - b\|_2$ for any positive integer $j$. Studying algorithms for these regressions is essential, as the matrix exponential can be approximated term by term via these smaller problems. The second proxy applies the exponential entrywise to the Gram matrix, denoted by $\exp(AA^\top)$, and solves the regression $\min_{x \in \mathbb{R}^n} \|\exp(AA^\top) x - b\|_2$. We call this problem the attention kernel regression problem, as the matrix $\exp(AA^\top)$ can be viewed as a kernel function with respect to $A$. We design fast algorithms for these regression problems based on sketching and preconditioning. We hope these efforts will provide an alternative perspective on studying efficient approximation of attention matrices.
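To make the "sketching and preconditioning" idea concrete, the sketch below solves a basic overdetermined least-squares problem $\min_{x} \|Ax - b\|_2$ by computing a QR factorization of a sketched matrix $SA$ and using the resulting $R$ factor to precondition LSQR. This is a minimal illustration of the general technique, not the paper's algorithm: the Gaussian sketch, the sketch size $m = 4d$, and the function name are all illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr


def sketch_precondition_lsq(A, b, sketch_rows=None, seed=0):
    """Solve min_x ||Ax - b||_2 via a sketched QR preconditioner.

    A minimal sketch, assuming a dense Gaussian sketching matrix;
    faster structured sketches (e.g., subsampled transforms) are
    typically used in practice.
    """
    n, d = A.shape
    m = sketch_rows or 4 * d  # heuristic sketch size, assumed here
    rng = np.random.default_rng(seed)

    # Gaussian sketch S: m x n, scaled so S preserves norms in expectation.
    S = rng.standard_normal((m, n)) / np.sqrt(m)

    # QR of the sketched matrix; R serves as a right preconditioner.
    _, R = np.linalg.qr(S @ A)

    # Preconditioned operator A R^{-1}: well-conditioned with high
    # probability, so LSQR converges in few iterations.
    AR_inv = LinearOperator(
        (n, d),
        matvec=lambda v: A @ np.linalg.solve(R, v),
        rmatvec=lambda v: np.linalg.solve(R.T, A.T @ v),
    )
    y = lsqr(AR_inv, b)[0]  # solve A R^{-1} y ~= b
    return np.linalg.solve(R, y)  # recover x = R^{-1} y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((2000, 50))
    b = rng.standard_normal(2000)
    x = sketch_precondition_lsq(A, b)
    x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
    print(np.linalg.norm(x - x_ref))  # small residual vs. direct solve
```

For the regressions against $(A^\top A)^j$ and $A(A^\top A)^j$ in the abstract, one would expect such a least-squares solver to be invoked repeatedly, one factor of $A^\top A$ at a time; the exact reduction and its error analysis are the subject of the paper itself.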
