SOTAVerified

Texture Synthesis

The fundamental goal of example-based texture synthesis is to generate a texture, usually larger than the input, that faithfully captures all the visual characteristics of the exemplar, yet is neither identical to it nor exhibits obvious, unnatural-looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion
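To make the goal concrete, here is a minimal sketch of the simplest example-based approach: tiling randomly sampled patches of the exemplar into a larger canvas. This is a generic naive baseline, not the method of any paper listed below; the function name and parameters are illustrative. It reproduces local statistics of the exemplar but, lacking overlap blending or seam optimization, shows exactly the block artifacts that quilting and adversarial-expansion methods are designed to avoid.

```python
import numpy as np

def naive_patch_synthesis(exemplar, out_h, out_w, patch=16, seed=0):
    """Tile randomly sampled exemplar patches into a larger output.

    A deliberately simple baseline: local appearance is copied from the
    exemplar, but patch borders produce visible seams.
    """
    rng = np.random.default_rng(seed)
    h, w = exemplar.shape[:2]
    out = np.zeros((out_h, out_w) + exemplar.shape[2:], dtype=exemplar.dtype)
    for y in range(0, out_h, patch):
        for x in range(0, out_w, patch):
            # Clip the patch at the right/bottom edges of the canvas.
            ph = min(patch, out_h - y)
            pw = min(patch, out_w - x)
            # Pick a random source location in the exemplar.
            sy = rng.integers(0, h - ph + 1)
            sx = rng.integers(0, w - pw + 1)
            out[y:y + ph, x:x + pw] = exemplar[sy:sy + ph, sx:sx + pw]
    return out

# Expand a 64x64 exemplar to a 256x256 output (larger than the input).
exemplar = np.random.default_rng(1).random((64, 64, 3))
result = naive_patch_synthesis(exemplar, 256, 256)
print(result.shape)  # (256, 256, 3)
```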

Papers

Showing 231–240 of 280 papers

Title | Status | Hype
Semantic Image Translation for Repairing the Texture Defects of Building Models | | 0
Single Mesh Diffusion Models with Field Latents for Texture Generation | | 0
STS-GAN: Can We Synthesize Solid Texture with High Fidelity from Arbitrary 2D Exemplar? | | 0
Spatial Degradation-Aware and Temporal Consistent Diffusion Model for Compressed Video Super-Resolution | | 0
Step1X-3D: Towards High-Fidelity and Controllable Generation of Textured 3D Assets | | 0
Stochastic Geometry Models for Texture Synthesis of Machined Metallic Surfaces: Sandblasting and Milling | | 0
Non-Stationary Texture Synthesis by Adversarial Expansion | Code | 0
Style-Transfer via Texture-Synthesis | Code | 0
Two-Stream Convolutional Networks for Dynamic Texture Synthesis | Code | 0
Co-occurrence Based Texture Synthesis | Code | 0
