
Hyper-Transforming Latent Diffusion Models

2025-04-23

Ignacio Peis, Batuhan Koyuncu, Isabel Valera, Jes Frellsen


Abstract

We introduce a novel generative framework for functions by integrating Implicit Neural Representations (INRs) and Transformer-based hypernetworks into latent variable models. Unlike prior approaches that rely on MLP-based hypernetworks with scalability limitations, our method employs a Transformer-based decoder to generate INR parameters from latent variables, addressing both representation capacity and computational efficiency. Our framework extends latent diffusion models (LDMs) to INR generation by replacing standard decoders with a Transformer-based hypernetwork, which can be trained either from scratch or via hyper-transforming, a strategy that fine-tunes only the decoder while freezing the pre-trained latent space. This enables efficient adaptation of existing generative models to INR-based representations without requiring full retraining.
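The core idea in the abstract can be sketched as follows: a Transformer-based decoder attends over a latent code and emits the flattened weights of a small INR (a coordinate MLP), which is then evaluated at query coordinates. This is a minimal illustrative sketch, not the paper's implementation; all class names, layer sizes, the learned query tokens, and the sinusoidal activation (SIREN-style) are assumptions.

```python
import torch
import torch.nn as nn

class TransformerHypernetwork(nn.Module):
    """Illustrative sketch: map a latent code z to the parameters of a
    small INR mapping 2-D coordinates to a scalar value. All sizes are
    hypothetical, chosen only to make the example runnable."""
    def __init__(self, latent_dim=16, d_model=64, n_tokens=8, inr_hidden=32):
        super().__init__()
        # Target INR: coord (2) -> hidden -> hidden -> value (1),
        # stored as alternating (weight_shape, bias_shape) pairs.
        self.inr_shapes = [(2, inr_hidden), (inr_hidden,),
                           (inr_hidden, inr_hidden), (inr_hidden,),
                           (inr_hidden, 1), (1,)]
        self.n_params = sum(torch.Size(s).numel() for s in self.inr_shapes)
        # Learned query tokens; the Transformer decoder cross-attends
        # from these queries to the (projected) latent code.
        self.queries = nn.Parameter(torch.randn(n_tokens, d_model))
        self.latent_proj = nn.Linear(latent_dim, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.to_params = nn.Linear(n_tokens * d_model, self.n_params)

    def forward(self, z):
        b = z.shape[0]
        memory = self.latent_proj(z).unsqueeze(1)         # (B, 1, d_model)
        tgt = self.queries.unsqueeze(0).expand(b, -1, -1)  # (B, T, d_model)
        h = self.decoder(tgt, memory)                      # (B, T, d_model)
        return self.to_params(h.flatten(1))                # (B, n_params)

def eval_inr(params, coords, shapes):
    """Evaluate the generated INR at coordinates `coords` of shape (N, 2)."""
    i, x = 0, coords
    layers = list(zip(shapes[0::2], shapes[1::2]))
    for k, (w_shape, b_shape) in enumerate(layers):
        w_n, b_n = w_shape[0] * w_shape[1], b_shape[0]
        w = params[i:i + w_n].view(w_shape); i += w_n
        b = params[i:i + b_n]; i += b_n
        x = x @ w + b
        if k < len(layers) - 1:       # no activation on the output layer
            x = torch.sin(x)          # sinusoidal activation, SIREN-style
    return x
```

The "hyper-transforming" strategy described above would then correspond to freezing a pre-trained latent space (e.g. an LDM's encoder and diffusion prior) and training only a decoder of this kind.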
