We propose a new technique for visual attribute transfer across images that may have very different appearance but share perceptually similar semantic structure. By visual attribute transfer, we mean the transfer of visual information (such as color, tone, texture, and style) from one image to another. For example, one image could be a painting or a sketch while the other is a photo of a real scene, with both depicting the same type of scene. Our technique finds semantically meaningful dense correspondences between two input images. To accomplish this, it adapts the notion of "image analogy" using features extracted from a Deep Convolutional Neural Network for matching; we call our technique Deep Image Analogy. A coarse-to-fine strategy is used to compute the nearest-neighbor field for generating the results. We validate the effectiveness of our proposed method in a variety of cases, including style/texture transfer, color/style swap, sketch/painting to photo, and time lapse.