Identity-preserving Face Recovery from Portraits
by
Fatemeh Shiri, Xin Yu, Fatih Porikli, Richard Hartley, Piotr Koniusz
2018
Abstract
Recovering the latent photorealistic faces from their artistic portraits aids
human perception and facial analysis. However, a recovery process that can
preserve identity is challenging because the fine details of real faces can be
distorted or lost in stylized images. In this paper, we present a new
Identity-preserving Face Recovery from Portraits (IFRP) method to recover
latent photorealistic faces from unaligned stylized portraits. Our IFRP method
consists of two components: Style Removal Network (SRN) and Discriminative
Network (DN). The SRN is designed to transfer feature maps of stylized images
to the feature maps of the corresponding photorealistic faces. By embedding
spatial transformer networks into the SRN, our method can compensate for
misalignments of stylized faces automatically and output aligned realistic face
images. The role of the DN is to enforce recovered faces to be similar to
authentic faces. To ensure identity preservation, we encourage the recovered
and ground-truth faces to share similar visual features via a distance measure
on features extracted by a pre-trained VGG network. We evaluate our method on
a large-scale synthesized dataset of real and stylized face pairs and attain
state-of-the-art results. In
addition, our method can recover photorealistic faces from previously unseen
stylized portraits, original paintings and human-drawn sketches.
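The identity-preserving distance described above can be illustrated with a minimal NumPy sketch. Here a single hand-rolled convolution-plus-ReLU layer stands in for the pre-trained VGG feature extractor (the paper uses deep VGG features, not this toy layer), and `identity_loss` is a hypothetical name for the mean squared distance between the features of the recovered and ground-truth faces:

```python
import numpy as np

def toy_features(img, kernel):
    """Stand-in for VGG features: one valid 2-D convolution followed by ReLU."""
    h, w = img.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)

def identity_loss(recovered, ground_truth, kernel):
    """Mean squared distance between feature maps of the two faces."""
    f_rec = toy_features(recovered, kernel)
    f_gt = toy_features(ground_truth, kernel)
    return float(np.mean((f_rec - f_gt) ** 2))
```

In training, a loss of this form would be minimized alongside the adversarial objective of the Discriminative Network, pulling the recovered face toward the ground-truth identity in feature space rather than only in pixel space.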
Archived Files and Locations
application/pdf, 3.1 MB
arxiv.org (repository); web.archive.org (webarchive)
arXiv: 1801.02279v2