SOTAVerified

Scaling Continuous Kernels with Sparse Fourier Domain Learning

2024-09-15

Clayton Harper, Luke Wood, Peter Gerstoft, Eric C. Larson

Abstract

We address three key challenges in learning continuous kernel representations: computational efficiency, parameter efficiency, and spectral bias. Continuous kernels have shown significant potential, but their practical adoption is often limited by high computational and memory demands. Additionally, these methods are prone to spectral bias, which impedes their ability to capture high-frequency details. To overcome these limitations, we propose a novel approach that leverages sparse learning in the Fourier domain. Our method enables the efficient scaling of continuous kernels, drastically reduces computational and memory requirements, and mitigates spectral bias by exploiting the Gibbs phenomenon.
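The core idea of the abstract can be illustrated with a minimal sketch: a continuous 1-D kernel parameterized by a small, sparse set of Fourier coefficients, so that memory scales with the number of active frequencies rather than the kernel's sampled size. The names (`n_freq`, `n_active`, `kernel`) and all details below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse Fourier parameterization of a 1-D continuous kernel:
# only n_active of the n_freq frequency coefficients are nonzero (learned),
# so the parameter cost is O(n_active), independent of sampling resolution.
n_freq, n_active = 64, 8
freqs = np.arange(n_freq)                        # integer frequencies on [0, 1)
active = rng.choice(n_freq, size=n_active, replace=False)
coeffs = np.zeros(n_freq, dtype=complex)
coeffs[active] = rng.standard_normal(n_active) + 1j * rng.standard_normal(n_active)

def kernel(x):
    """Evaluate the continuous kernel at arbitrary coordinates x in [0, 1)."""
    # Inverse Fourier synthesis restricted to the active frequencies.
    phase = 2j * np.pi * np.outer(np.atleast_1d(x), freqs[active])
    return (np.exp(phase) @ coeffs[active]).real

# Because the representation is continuous, the same parameters yield a
# kernel of any desired spatial size.
k_small = kernel(np.linspace(0, 1, 9, endpoint=False))    # 9-tap kernel
k_large = kernel(np.linspace(0, 1, 33, endpoint=False))   # 33-tap kernel
print(k_small.shape, k_large.shape)
```

Because high frequencies can be placed directly in the active set, such a parameterization can represent sharp, high-frequency structure without the spectral bias typical of coordinate-MLP kernel representations.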
