Texture Synthesis

The fundamental goal of example-based texture synthesis is to generate a texture, usually larger than the input, that faithfully captures the visual characteristics of the exemplar, yet is neither identical to it nor marred by obvious, unnatural-looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion
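The definition above can be illustrated with a deliberately simple baseline: a minimal sketch (assuming a NumPy array exemplar) that fills a canvas larger than the input by copying randomly sampled patches from the exemplar. This is a toy non-parametric approach, not the adversarial-expansion method cited above; the function name and parameters are illustrative.

```python
import numpy as np

def synthesize_by_patch_tiling(exemplar, out_h, out_w, patch=8, seed=0):
    """Toy example-based synthesis: tile a larger canvas with patches
    sampled at random positions from the exemplar. Output is larger than
    the input and not identical to it, but visible seams between patches
    are exactly the kind of artifact better methods avoid."""
    rng = np.random.default_rng(seed)
    h, w = exemplar.shape[:2]
    out = np.zeros((out_h, out_w) + exemplar.shape[2:], dtype=exemplar.dtype)
    for y in range(0, out_h, patch):
        for x in range(0, out_w, patch):
            # Pick a random top-left corner inside the exemplar.
            sy = rng.integers(0, h - patch + 1)
            sx = rng.integers(0, w - patch + 1)
            # Clip the patch at the canvas border.
            ph = min(patch, out_h - y)
            pw = min(patch, out_w - x)
            out[y:y + ph, x:x + pw] = exemplar[sy:sy + ph, sx:sx + pw]
    return out

# Expand a 32x32 exemplar to a 96x96 texture.
exemplar = np.random.default_rng(1).random((32, 32, 3))
texture = synthesize_by_patch_tiling(exemplar, 96, 96)
```

Real methods (including the GAN-based expansion quoted above) replace the blind copy with mechanisms that preserve local statistics across patch boundaries.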

Papers

Showing 1–10 of 280 papers

Title | Status | Hype
Hunyuan3D 2.0: Scaling Diffusion Models for High Resolution Textured 3D Assets Generation | Code | 11
3DTopia: Large Text-to-3D Generation Model with Hybrid Diffusion Priors | Code | 4
Hunyuan3D 2.5: Towards High-Fidelity 3D Assets Generation with Ultimate Details | Code | 3
Pandora3D: A Comprehensive Framework for High-Quality 3D Shape and Texture Generation | Code | 3
TEXGen: a Generative Diffusion Model for Mesh Textures | Code | 3
Generic 3D Diffusion Adapter Using Controlled Multi-View Editing | Code | 3
AvatarCLIP: Zero-Shot Text-Driven Generation and Animation of 3D Avatars | Code | 3
Aggregated Contextual Transformations for High-Resolution Image Inpainting | Code | 3
Texture Memory-Augmented Deep Patch-Based Image Inpainting | Code | 3
UniTEX: Universal High Fidelity Generative Texturing for 3D Shapes | Code | 2

No leaderboard results yet.