Style-aware and multi-scale attention for face image completion
Affiliation: 1. School of Electrical and Information Engineering, Tianjin University, Tianjin 300072, China; 2. State Key Laboratory of Digital Publishing Technology, Beijing 100871, China
CLC Number: TN911.73

Abstract:

Face image completion is an important image processing technique for reconstructing face images in computer vision. Existing face image completion methods suffer from unreasonable global semantics, mainly because they lack long-range transfer capability and therefore cannot reasonably transfer information from the known regions of a damaged image to its occluded regions. To overcome this problem, a novel encoder-decoder face image completion network integrating style-aware and multi-scale attention was proposed under the generative adversarial network (GAN) framework. Specifically, the style-aware module extracted the global semantic information of an image, and the extracted information was used to globally adjust the completion process by rendering the image encoding level by level. The multi-scale attention module extracted patches of multi-scale features and performed long-range transfer via matrix multiplication between a shared attention score and the extracted patches. Experimental results on the public CelebA-HQ dataset show that the style-aware module and the multi-scale attention module greatly enhanced the long-range transfer capability of the completion network. Compared with existing state-of-the-art face image completion methods, the proposed model achieved significant improvements on various evaluation metrics; moreover, the global semantics of its completion results were more reasonable, and the completion effect was more natural under low-lighting conditions.
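The multi-scale attention described above can be sketched as follows: patches are extracted from a feature map at several scales, and a single shared attention score transfers information between all spatial positions via matrix multiplication. This is a minimal PyTorch sketch of the idea only; the function name, scale choices, and cosine-similarity scoring are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F


def multiscale_attention(feat: torch.Tensor, scales=(1, 3)) -> torch.Tensor:
    """feat: (B, C, H, W) feature map; returns (B, C * len(scales), H, W)."""
    b, c, h, w = feat.shape

    # Shared attention score: cosine similarity between every pair of
    # spatial positions, computed once at the base resolution.
    flat = F.normalize(feat.flatten(2), dim=1)                  # (B, C, H*W)
    score = torch.softmax(flat.transpose(1, 2) @ flat, dim=-1)  # (B, HW, HW)

    outputs = []
    for s in scales:
        pad = s // 2  # odd kernel sizes keep one patch per spatial position
        patches = F.unfold(feat, kernel_size=s, padding=pad)    # (B, C*s*s, HW)

        # Long-range transfer: the same score matrix recombines patches at
        # every scale (position j gathers patches weighted by score[j, :]).
        moved = patches @ score.transpose(1, 2)                 # (B, C*s*s, HW)

        out = F.fold(moved, (h, w), kernel_size=s, padding=pad)
        # Divide by the overlap count so overlapping patches average out.
        ones = torch.ones(b, 1, h, w, device=feat.device)
        counts = F.fold(F.unfold(ones, s, padding=pad), (h, w), s, padding=pad)
        outputs.append(out / counts)

    return torch.cat(outputs, dim=1)
```

In a completion network of this kind, the input would be the encoder features of the damaged image, and the concatenated multi-scale output could be fused back to the original channel count by a 1×1 convolution.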

History
  • Received: October 08, 2020
  • Online: April 25, 2022