
FastFace: Tuning Identity Preservation in Distilled Diffusion via Guidance and Attention

2025-05-27 · Code Available

Sergey Karpukhin, Vadim Titov, Andrey Kuznetsov, Aibek Alanov


Abstract

In recent years, a plethora of identity-preserving adapters for personalized generation with diffusion models have been released. Their main disadvantage is that they are predominantly trained jointly with base diffusion models, which suffer from slow multi-step inference. This work tackles the challenge of training-free adaptation of pretrained ID-adapters to diffusion models accelerated via distillation. Through a careful redesign of classifier-free guidance for few-step stylistic generation and attention manipulation mechanisms in decoupled blocks that improve identity similarity and fidelity, we propose the universal FastFace framework. Additionally, we develop a disentangled public evaluation protocol for ID-preserving adapters.
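For background, the abstract's mention of classifier-free guidance refers to the standard technique of extrapolating from an unconditional noise prediction toward a conditional one at sampling time; FastFace's contribution is a redesign of this scheme for few-step distilled models, whose details are not given here. Below is a minimal sketch of the *standard* formulation only; the function name `cfg_combine` and the toy arrays are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    """Standard classifier-free guidance: extrapolate from the
    unconditional prediction toward the conditional one.
    guidance_scale > 1 amplifies the conditioning signal.
    (Illustrative helper; not FastFace's redesigned guidance.)"""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

# Toy example with dummy noise predictions of shape (4,).
eps_u = np.zeros(4)   # unconditional prediction
eps_c = np.ones(4)    # identity-conditioned prediction
combined = cfg_combine(eps_u, eps_c, 7.5)
print(combined)  # → [7.5 7.5 7.5 7.5]
```

With a scale of 1.0 the combined prediction reduces to the conditional one; distilled few-step models are typically trained without guidance, which is one reason a naive application of this formula breaks down and motivates the paper's redesign.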
