
O(k)-Equivariant Dimensionality Reduction on Stiefel Manifolds

2023-09-19 · Code Available

Andrew Lee, Harlin Lee, Jose A. Perea, Nikolas Schonsheck, Madeleine Weinstein


Abstract

Many real-world datasets live on high-dimensional Stiefel and Grassmannian manifolds, V_k(R^N) and Gr(k, R^N) respectively, and benefit from projection onto lower-dimensional Stiefel and Grassmannian manifolds. In this work, we propose an algorithm called Principal Stiefel Coordinates (PSC) to reduce data dimensionality from V_k(R^N) to V_k(R^n) in an O(k)-equivariant manner (k ≤ n ≤ N). We begin by observing that each element α ∈ V_n(R^N) defines an isometric embedding of V_k(R^n) into V_k(R^N). Next, we describe two ways of finding a suitable embedding map α: one via an extension of principal component analysis (α_PCA), and one that further minimizes data fit error using gradient descent (α_GD). Then, we define a continuous and O(k)-equivariant map π_α that acts as a "closest point operator" to project the data onto the image of V_k(R^n) in V_k(R^N) under the embedding determined by α, while minimizing distortion. Because this dimensionality reduction is O(k)-equivariant, these results extend to Grassmannian manifolds as well. Lastly, we show that π_{α_PCA} globally minimizes projection error in a noiseless setting, while π_{α_GD} achieves a meaningfully different and improved outcome when the data does not lie exactly on the image of a linearly embedded lower-dimensional Stiefel manifold as above. Multiple numerical experiments using synthetic and real-world data are performed.
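The closest-point operator described in the abstract can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: it assumes π_α(X) = α · Π(αᵀX), where Π projects an n×k matrix onto V_k(R^n) via the orthogonal polar factor of its SVD. The function names and this specific formula are our assumptions for illustration.

```python
import numpy as np

def stiefel_project(M):
    """Closest point in V_k(R^n) to an n x k matrix M in Frobenius norm,
    computed as the orthogonal polar factor U @ Vt of the SVD of M.
    (Unique when M has full column rank.)"""
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def pi_alpha(alpha, X):
    """Hypothetical sketch of the closest-point operator: project
    X in V_k(R^N) onto the image of V_k(R^n) under the isometric
    embedding Y -> alpha @ Y determined by alpha in V_n(R^N)."""
    return alpha @ stiefel_project(alpha.T @ X)

# Example: random alpha in V_n(R^N) and X in V_k(R^N) via QR.
rng = np.random.default_rng(0)
N, n, k = 8, 4, 2
alpha, _ = np.linalg.qr(rng.standard_normal((N, n)))
X, _ = np.linalg.qr(rng.standard_normal((N, k)))
P = pi_alpha(alpha, X)  # a point of V_k(R^N) in the embedded image
```

Because the polar factor satisfies Π(MQ) = Π(M)Q for any Q in O(k), this sketch is O(k)-equivariant: π_α(XQ) = π_α(X)Q, which is the property the abstract highlights.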
