Uniform Memory Retrieval with Larger Capacity for Modern Hopfield Models

2024-04-04

Dennis Wu, Jerry Yao-Chieh Hu, Teng-Yun Hsiao, Han Liu

Abstract

We propose a two-stage memory retrieval dynamics for modern Hopfield models, termed U-Hop, with enhanced memory capacity. Our key contribution is a learnable feature map that transforms the Hopfield energy function into kernel space. This transformation ensures convergence between the local minima of the energy and the fixed points of the retrieval dynamics within the kernel space. Consequently, the kernel norm induced by the learned feature map serves as a novel similarity measure. It utilizes the stored memory patterns as learning data to enhance memory capacity across all modern Hopfield models. Specifically, we accomplish this by constructing a separation loss that separates the local minima of the kernelized energy by separating the stored memory patterns in kernel space. Methodologically, the U-Hop memory retrieval process consists of (Stage I) minimizing the separation loss for a more uniform distribution of memories (local minima), followed by (Stage II) standard Hopfield energy minimization for memory retrieval. This significantly reduces the number of possible metastable states in the Hopfield energy landscape, thereby enhancing memory capacity by preventing memory confusion. Empirically, on real-world datasets, we demonstrate that U-Hop outperforms all existing modern Hopfield models and state-of-the-art similarity measures, achieving substantial improvements in both associative memory retrieval and deep learning tasks. Code is available at https://github.com/MAGICS-LAB/UHop; future updates appear on arXiv:2404.03827.
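The two-stage procedure described in the abstract can be sketched numerically as follows. This is a minimal illustration, not the paper's implementation: it assumes a linear feature map, uses the sum of off-diagonal kernel inner products as a stand-in for the separation loss, and runs a softmax-based modern Hopfield update for retrieval. All function names and hyperparameters are illustrative.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D logit vector
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def separation_loss(W, Xi):
    # Stand-in separation objective: total off-diagonal similarity of the
    # mapped memories Phi(xi) = W @ xi (smaller means better separated).
    Phi = W @ Xi                        # (k, M) mapped memory patterns
    G = Phi.T @ Phi                     # (M, M) Gram matrix in kernel space
    return G.sum() - np.trace(G)

def stage1(Xi, k, steps=200, lr=1e-2, seed=0):
    # Stage I: learn a linear feature map W that spreads the stored
    # memories apart in kernel space (projected gradient descent).
    rng = np.random.default_rng(seed)
    d, M = Xi.shape
    W = rng.standard_normal((k, d)) / np.sqrt(d)
    s = Xi.sum(axis=1, keepdims=True)
    # loss = tr(W^T W S) with S = s s^T - Xi Xi^T, so grad = 2 W S
    S = s @ s.T - Xi @ Xi.T
    for _ in range(steps):
        W -= lr * 2.0 * (W @ S)
        # re-normalize rows so the map cannot collapse or blow up
        W /= np.linalg.norm(W, axis=1, keepdims=True)
    return W

def stage2(W, Xi, query, beta=4.0, iters=5):
    # Stage II: softmax-based modern Hopfield retrieval, with the
    # similarity computed between kernel-space representations.
    x = query.astype(float).copy()
    for _ in range(iters):
        p = softmax(beta * (W @ Xi).T @ (W @ x))
        x = Xi @ p                      # convex combination of stored patterns
    return x
```

A typical usage: store unit-norm patterns as columns of `Xi`, run `stage1` once to fit the feature map, then call `stage2` on a noisy query; the update contracts toward the stored pattern whose kernel similarity to the query dominates.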
