TY - GEN
T1 - Generating Non-Stationary Textures Using Self-Rectification
AU - Zhou, Yang
AU - Xiao, Rongjun
AU - Lischinski, Dani
AU - Cohen-Or, Daniel
AU - Huang, Hui
N1 - Publisher Copyright: © 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - This paper addresses the challenge of example-based non-stationary texture synthesis. We introduce a novel two-step approach wherein users first modify a reference texture using standard image editing tools, yielding an initial rough target for the synthesis. Subsequently, our proposed method, termed 'self-rectification', automatically refines this target into a coherent, seamless texture, while faithfully preserving the distinct visual characteristics of the reference exemplar. Our method leverages a pretrained diffusion network, and uses self-attention mechanisms, to gradually align the synthesized texture with the reference, ensuring the retention of the structures in the provided target. Through experimental validation, our approach exhibits exceptional proficiency in handling non-stationary textures, demonstrating significant advancements in texture synthesis when compared to existing state-of-the-art techniques. Code is available at https://github.com/xiaorongjun000/Self-Rectification
AB - This paper addresses the challenge of example-based non-stationary texture synthesis. We introduce a novel two-step approach wherein users first modify a reference texture using standard image editing tools, yielding an initial rough target for the synthesis. Subsequently, our proposed method, termed 'self-rectification', automatically refines this target into a coherent, seamless texture, while faithfully preserving the distinct visual characteristics of the reference exemplar. Our method leverages a pretrained diffusion network, and uses self-attention mechanisms, to gradually align the synthesized texture with the reference, ensuring the retention of the structures in the provided target. Through experimental validation, our approach exhibits exceptional proficiency in handling non-stationary textures, demonstrating significant advancements in texture synthesis when compared to existing state-of-the-art techniques. Code is available at https://github.com/xiaorongjun000/Self-Rectification
KW - diffusion network
KW - Non-stationary Textures
KW - Self-attention mechanism
KW - Texture Synthesis
UR - http://www.scopus.com/inward/record.url?scp=85207289600&partnerID=8YFLogxK
U2 - 10.1109/CVPR52733.2024.00742
DO - 10.1109/CVPR52733.2024.00742
M3 - Conference contribution
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 7767
EP - 7776
BT - Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
PB - IEEE Computer Society
T2 - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Y2 - 16 June 2024 through 22 June 2024
ER -