SOTAVerified

Texture Synthesis

The fundamental goal of example-based texture synthesis is to generate a texture, usually larger than the input, that faithfully captures the visual characteristics of the exemplar, yet is neither identical to it nor exhibits obvious, unnatural-looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion
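To make the goal above concrete, here is a minimal sketch of the simplest example-based baseline: tiling a larger output with patches sampled at random from the exemplar. This is an illustrative toy, not the method of any listed paper (which use diffusion models, GANs, or neural cellular automata); the function name and parameters are assumptions for the sketch.

```python
import numpy as np

def synthesize(exemplar: np.ndarray, out_h: int, out_w: int,
               patch: int = 8, seed: int = 0) -> np.ndarray:
    """Naive example-based synthesis: fill the output grid with
    patches copied from random locations in the exemplar.
    Reproduces local statistics only; no overlap blending,
    so visible seams (exactly the artifacts real methods avoid)
    are expected."""
    rng = np.random.default_rng(seed)
    h, w = exemplar.shape
    out = np.zeros((out_h, out_w), dtype=exemplar.dtype)
    for y in range(0, out_h, patch):
        for x in range(0, out_w, patch):
            # pick a random top-left corner inside the exemplar
            sy = rng.integers(0, h - patch + 1)
            sx = rng.integers(0, w - patch + 1)
            # clip the patch at the output border
            ph = min(patch, out_h - y)
            pw = min(patch, out_w - x)
            out[y:y + ph, x:x + pw] = exemplar[sy:sy + ph, sx:sx + pw]
    return out

# Usage: expand a 16x16 grayscale exemplar to a 64x64 texture.
exemplar = np.random.default_rng(1).random((16, 16))
texture = synthesize(exemplar, 64, 64)
print(texture.shape)
```

Replacing the random patch choice with an overlap-matching search (as in image quilting) is the classical way to suppress the seams this naive version produces; the papers listed below go further by learning the expansion with neural networks.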

Papers

Showing 121–130 of 280 papers

| Title | Status | Hype |
| --- | --- | --- |
| Generative AI meets 3D: A Survey on Text-to-3D in AIGC Era | | 0 |
| TUVF: Learning Generalizable Texture UV Radiance Fields | | 0 |
| Generating Texture for 3D Human Avatar from a Single Image using Sampling and Refinement Networks | | 0 |
| Semantic Image Translation for Repairing the Texture Defects of Building Models | | 0 |
| Text2Tex: Text-driven Texture Synthesis via Diffusion Models | Code | 1 |
| 3DGen: Triplane Latent Diffusion for Textured Mesh Generation | Code | 2 |
| A geometrically aware auto-encoder for multi-texture synthesis | Code | 0 |
| Neural Texture Synthesis With Guided Correspondence | Code | 1 |
| ClipFace: Text-guided Editing of Textured 3D Morphable Models | Code | 1 |
| DyNCA: Real-time Dynamic Texture Synthesis Using Neural Cellular Automata | | 0 |

No leaderboard results yet.