SOTAVerified

Texture Synthesis

The fundamental goal of example-based texture synthesis is to generate a texture, usually larger than the input, that faithfully captures all the visual characteristics of the exemplar, yet is neither identical to it nor exhibits obvious, unnatural-looking artifacts.

Source: Non-Stationary Texture Synthesis by Adversarial Expansion
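The goal stated above can be illustrated with the simplest possible baseline: tiling an output canvas with patches sampled at random from the exemplar. This is a minimal sketch for intuition only, not the method of any paper listed below; the function name and parameters are illustrative assumptions, and real methods additionally optimize patch placement or blend overlaps to avoid seams.

```python
import numpy as np

def synthesize(exemplar, out_h, out_w, patch=8, rng=None):
    """Fill an (out_h, out_w) canvas with square patches sampled at
    random locations from the exemplar. Illustrative sketch: no
    overlap blending, so visible seams remain between patches."""
    rng = np.random.default_rng(rng)
    h, w = exemplar.shape[:2]
    out = np.zeros((out_h, out_w) + exemplar.shape[2:], exemplar.dtype)
    for y in range(0, out_h, patch):
        for x in range(0, out_w, patch):
            # Pick a random top-left corner inside the exemplar.
            sy = rng.integers(0, h - patch + 1)
            sx = rng.integers(0, w - patch + 1)
            # Clip the patch at the canvas border.
            ph = min(patch, out_h - y)
            pw = min(patch, out_w - x)
            out[y:y + ph, x:x + pw] = exemplar[sy:sy + ph, sx:sx + pw]
    return out

# Example: expand a 32x32 grayscale exemplar to a 64x96 output.
exemplar = np.random.default_rng(0).random((32, 32))
result = synthesize(exemplar, 64, 96, patch=8, rng=1)
```

Because every patch is copied verbatim, the output reproduces the exemplar's local statistics but repeats no global layout, which is roughly the "faithful yet not identical" criterion; the artifacts it does produce (patch seams) are what more sophisticated synthesis methods are designed to eliminate.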

Papers

Showing 121–130 of 280 papers

Title (Hype)
- Dual Pipeline Style Transfer with Input Distribution Differentiation (0)
- InTeX: Interactive Text-to-texture Synthesis via Unified Depth-aware Inpainting (0)
- Color and Texture Dual Pipeline Lightweight Style Transfer (0)
- A Procedural Texture Generation Framework Based on Semantic Descriptions (0)
- Learning an Action-Conditional Model for Haptic Texture Generation (0)
- Learning in a Single Domain for Non-Stationary Multi-Texture Synthesis (0)
- DTSGAN: Learning Dynamic Textures via Spatiotemporal Generative Adversarial Network (0)
- HumanNorm: Learning Normal Diffusion Model for High-quality and Realistic 3D Human Generation (0)
- DressCode: Autoregressively Sewing and Generating Garments from Text Guidance (0)
- Dress-1-to-3: Single Image to Simulation-Ready 3D Outfit with Diffusion Prior and Differentiable Physics (0)
Page 13 of 28

No leaderboard results yet.