SOTAVerified

Texture Synthesis

The fundamental goal of example-based Texture Synthesis is to generate a texture, usually larger than the input, that faithfully captures all the visual characteristics of the exemplar, yet is neither identical to it, nor exhibits obvious unnatural looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion

Papers

Showing 81–90 of 280 papers

| Title | Status | Hype |
|---|---|---|
| 3DTextureTransformer: Geometry Aware Texture Generation for Arbitrary Mesh Topology | | 0 |
| 3DTopia: Large Text-to-3D Generation Model with Hybrid Diffusion Priors | Code | 4 |
| DragTex: Generative Point-Based Texture Editing on 3D Mesh | | 0 |
| Minecraft-ify: Minecraft Style Image Generation with Text-guided Image Editing for In-Game Application | | 0 |
| CTGAN: Semantic-guided Conditional Texture Generator for 3D Shapes | | 0 |
| DressCode: Autoregressively Sewing and Generating Garments from Text Guidance | | 0 |
| TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion | | 0 |
| Generating Non-Stationary Textures using Self-Rectification | Code | 1 |
| Text-Guided 3D Face Synthesis - From Generation to Editing | Code | 1 |
| UV-IDM: Identity-Conditioned Latent Diffusion Model for Face UV-Texture Generation | Code | 1 |
Page 9 of 28

No leaderboard results yet.