
Texture Synthesis

The fundamental goal of example-based texture synthesis is to generate a texture, usually larger than the input, that faithfully captures the visual characteristics of the exemplar, yet is neither identical to it nor marred by obvious, unnatural-looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion
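The task definition above can be illustrated with a minimal sketch: the simplest example-based approach fills an output canvas larger than the exemplar with patches sampled from it. This is a naive baseline of my own construction (function names are hypothetical), not the adversarial-expansion method cited above or any listed paper's algorithm; real methods additionally enforce seam coherence or learn a generative model.

```python
import numpy as np

def synthesize(exemplar: np.ndarray, out_size: tuple, patch: int = 8,
               seed: int = 0) -> np.ndarray:
    """Naive patch-based synthesis: tile the output with patches drawn
    uniformly at random from the exemplar. No seam blending is done,
    so visible patch boundaries are expected."""
    rng = np.random.default_rng(seed)
    H, W = exemplar.shape[:2]
    oh, ow = out_size
    out = np.zeros((oh, ow) + exemplar.shape[2:], dtype=exemplar.dtype)
    for y in range(0, oh, patch):
        for x in range(0, ow, patch):
            # Pick a random top-left corner so the patch fits in the exemplar.
            sy = int(rng.integers(0, H - patch + 1))
            sx = int(rng.integers(0, W - patch + 1))
            # Clip the patch at the right/bottom edges of the output.
            ph = min(patch, oh - y)
            pw = min(patch, ow - x)
            out[y:y + ph, x:x + pw] = exemplar[sy:sy + ph, sx:sx + pw]
    return out

# Synthesize a 64x64 texture from a 32x32 random RGB exemplar.
exemplar = np.random.default_rng(1).random((32, 32, 3))
result = synthesize(exemplar, (64, 64))
```

Because patches are sampled independently, the output is larger than the exemplar and never an exact copy of it, matching the goal stated above; the missing ingredient in this sketch is any mechanism for hiding the patch seams.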

Papers

Showing 21–30 of 280 papers

Title | Status | Hype
Tactile DreamFusion: Exploiting Tactile Sensing for 3D Generation | Code | 2
Real-World Blind Super-Resolution via Feature Matching with Implicit High-Resolution Priors | Code | 2
DiffTF++: 3D-aware Diffusion Transformer for Large-Vocabulary 3D Generation | Code | 2
Pretraining is All You Need for Image-to-Image Translation | Code | 2
TetWeave: Isosurface Extraction using On-The-Fly Delaunay Tetrahedral Grids for Gradient-Based Mesh Optimization | Code | 2
Texture Generation on 3D Meshes with Point-UV Diffusion | Code | 2
Conceptual Compression via Deep Structure and Texture Synthesis | Code | 1
Human Parsing Based Texture Transfer from Single Image to 3D Human via Cross-View Consistency | Code | 1
Combining Markov Random Fields and Convolutional Neural Networks for Image Synthesis | Code | 1
Everything's Talkin': Pareidolia Face Reenactment | Code | 1
Page 3 of 28

No leaderboard results yet.