Abstract
Style transfer is the process of migrating the style of a given image to the content of another, synthesizing a new image that is an artistic mixture of the two. Recent work on this problem adopting convolutional neural networks (CNNs) has ignited a renewed interest in the field, due to the very impressive results obtained. An alternative path toward handling the style transfer task exists, via the generalization of texture synthesis algorithms. This approach has been explored over the years, but its results are typically less impressive than the CNN ones. In this paper, we propose a novel style transfer algorithm that extends the texture synthesis work of Kwatra et al. (2005), aiming for stylized images that are closer in quality to the CNN ones. We modify Kwatra's algorithm in several key ways to achieve the desired transfer, with emphasis on a consistent way of keeping the content intact in selected regions while producing hallucinated and rich style in others. The results obtained are visually pleasing and diverse, and are shown to be competitive with recent CNN-based style transfer algorithms. The proposed algorithm is fast and flexible, able to process any pair of content and style images.
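To make the texture-synthesis view of style transfer concrete, the snippet below is a minimal sketch, not the authors' published algorithm: in the spirit of Kwatra-style patch-based optimization, it alternately matches every patch of the current estimate to its nearest neighbor in the style image, re-synthesizes the image by averaging the matched patches, and then blends the result with the content under a segmentation-like mask so that selected regions stay intact while the rest is free to hallucinate style. All function names, patch sizes, and weights are illustrative assumptions.

```python
# A minimal sketch (assumed, not the paper's implementation) of patch-based
# style transfer via texture synthesis: nearest-neighbor patch matching to the
# style image, patch averaging, and a mask-weighted content-fidelity blend.
import numpy as np


def extract_patches(img, size, stride):
    """Collect overlapping square patches and their top-left coordinates."""
    h, w = img.shape[:2]
    patches, coords = [], []
    for y in range(0, h - size + 1, stride):
        for x in range(0, w - size + 1, stride):
            patches.append(img[y:y + size, x:x + size].ravel())
            coords.append((y, x))
    return np.asarray(patches), coords


def nearest_style_patches(est_patches, style_patches):
    """Brute-force nearest-neighbor search (a tree/PCA structure could replace this)."""
    # Squared Euclidean distance between every estimate patch and every style patch.
    d = (np.sum(est_patches ** 2, axis=1, keepdims=True)
         - 2.0 * est_patches @ style_patches.T
         + np.sum(style_patches ** 2, axis=1))
    return style_patches[np.argmin(d, axis=1)]


def style_transfer(content, style, mask, size=8, stride=4, n_iters=10, style_weight=0.5):
    """content, style: float grayscale arrays in [0, 1]; mask: 1 where content must be kept."""
    est = content.copy()
    style_patches, _ = extract_patches(style, size, stride)
    for _ in range(n_iters):
        est_patches, coords = extract_patches(est, size, stride)
        matched = nearest_style_patches(est_patches, style_patches)
        # Average the matched style patches back into the image.
        acc = np.zeros_like(est)
        cnt = np.zeros_like(est)
        for p, (y, x) in zip(matched, coords):
            acc[y:y + size, x:x + size] += p.reshape(size, size)
            cnt[y:y + size, x:x + size] += 1.0
        synthesized = acc / np.maximum(cnt, 1.0)
        # Content-fidelity blend: the mask keeps selected regions close to the
        # content, while unmasked regions are free to take on the style.
        w = style_weight * (1.0 - mask)
        est = w * synthesized + (1.0 - w) * content
    return np.clip(est, 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    content = rng.random((64, 64))
    style = rng.random((64, 64))
    mask = np.zeros((64, 64))
    mask[16:48, 16:48] = 1.0  # keep the central region close to the content
    out = style_transfer(content, style, mask)
    print(out.shape, out.min(), out.max())
```

The `__main__` block only exercises the loop on random arrays; a real use would supply grayscale (or per-channel) content and style images and a segmentation mask marking the regions to preserve.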
| Original language | English |
| --- | --- |
| Article number | 7874180 |
| Pages (from-to) | 2338-2351 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Image Processing |
| Volume | 26 |
| Issue number | 5 |
| DOIs | |
| State | Published - May 2017 |
Keywords
- Style transfer
- convolutional neural networks
- image segmentation
- patch matching
- segmentation
- texture synthesis
- tree nearest-neighbor
All Science Journal Classification (ASJC) codes
- Software
- Computer Graphics and Computer-Aided Design