Makeup Style Transfer on Low-quality Images with Weighted Multi-scale Attention

Organisciak, Daniel, Ho, Edmond and Shum, Hubert (2021) Makeup Style Transfer on Low-quality Images with Weighted Multi-scale Attention. In: 2020 25th International Conference on Pattern Recognition (ICPR 2020): Milan, Italy, 10-15 January 2021. Proceedings of the International Conference on Pattern Recognition (ICPR). IEEE, Piscataway, NJ, pp. 6011-6018. ISBN 9781728188096, 9781728188089

Full text: Makeup_Style_Transfer.pdf - Accepted Version (1MB)
Official URL: https://doi.org/10.1109/icpr48806.2021.9412604

Abstract

Facial makeup style transfer is an extremely challenging sub-field of image-to-image translation. Due to this difficulty, state-of-the-art results mostly rely on the Face Parsing Algorithm, which segments a face into parts so that makeup features can be extracted easily. However, this algorithm only works well on high-definition images where facial features can be accurately extracted. Faces in many real-world photos, such as those containing a large background or multiple people, are typically of low resolution, which considerably hinders state-of-the-art algorithms. In this paper, we propose an end-to-end holistic approach that effectively transfers makeup styles between two low-resolution images. The idea is built upon a novel weighted multi-scale spatial attention module, which identifies salient pixel regions in low-resolution images at multiple scales and uses channel attention to determine the most effective attention map. This design provides two benefits: first, low-resolution images are blurry to varying extents, so a multi-scale architecture can select the most effective convolution kernel size for spatial attention; second, makeup is applied at both a macro level (foundation, fake tan) and a micro level (eyeliner, lipstick), so different scales excel at extracting different makeup features. We develop an Augmented CycleGAN network that embeds our attention modules at selected layers to transfer makeup most effectively. We evaluate our system on the FBD data set, which consists of many low-resolution facial images, and demonstrate that it outperforms state-of-the-art methods, particularly when transferring makeup to blurry and partially occluded images.
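To make the abstract's description of the weighted multi-scale spatial attention module more concrete, the following is a minimal PyTorch sketch, not the authors' released code: it computes spatial attention maps with several kernel sizes and uses a channel-attention-style gate to weight and fuse them before re-weighting the input features. All module and parameter names (e.g. WeightedMultiScaleSpatialAttention, kernel_sizes, reduction) are assumptions introduced here for illustration.

    # Illustrative sketch only; architecture details are assumed, not taken from the paper.
    import torch
    import torch.nn as nn

    class WeightedMultiScaleSpatialAttention(nn.Module):
        def __init__(self, channels, kernel_sizes=(3, 5, 7), reduction=4):
            super().__init__()
            # One spatial-attention branch per kernel size; each maps the
            # feature tensor to a single-channel attention map.
            self.branches = nn.ModuleList(
                nn.Conv2d(channels, 1, k, padding=k // 2) for k in kernel_sizes
            )
            # Channel-attention-style gate that scores the per-scale maps,
            # letting the network favour the kernel size best suited to the
            # (possibly blurry) input.
            self.gate = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(len(kernel_sizes), len(kernel_sizes) * reduction, 1),
                nn.ReLU(inplace=True),
                nn.Conv2d(len(kernel_sizes) * reduction, len(kernel_sizes), 1),
                nn.Softmax(dim=1),
            )

        def forward(self, x):
            # Stack per-scale spatial attention maps: (B, num_scales, H, W)
            maps = torch.cat([torch.sigmoid(b(x)) for b in self.branches], dim=1)
            # Per-scale weights from the gate: (B, num_scales, 1, 1)
            weights = self.gate(maps)
            # Weighted fusion into one spatial attention map, then re-weight the features.
            attention = (maps * weights).sum(dim=1, keepdim=True)
            return x * attention

    if __name__ == "__main__":
        # Example: attend over a low-resolution feature map inside a generator layer.
        feats = torch.randn(2, 64, 32, 32)
        attn = WeightedMultiScaleSpatialAttention(64)
        print(attn(feats).shape)  # torch.Size([2, 64, 32, 32])

In the full system described by the abstract, modules of this kind would be embedded at selected layers of an Augmented CycleGAN generator; where in the network they are placed, and how many scales are used, are design choices detailed in the paper itself.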

Item Type: Book Section
Additional Information: Funding information: The project is funded in part by the Royal Society (Ref: IES\R2\181024 and IES\R1\191147).
Subjects: G400 Computer Science
Department: Faculties > Engineering and Environment > Computer and Information Sciences
Depositing User: John Coen
Date Deposited: 07 Dec 2020 11:55
Last Modified: 31 Jul 2021 10:35
URI: http://nrl.northumbria.ac.uk/id/eprint/44928
