Integrating Pretrained Encoders for Generalized Face Frontalization
- Authors
- Choi, Wonyoung; Nam, Gi Pyo; Cho, Junghyun; Kim, Ig-Jae; Ko, Hyeong-Seok
- Issue Date
- 2024-03
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
- Citation
- IEEE Access, v.12, pp. 43530-43539
- Abstract
- In face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), which is designed to fully utilize the diverse facial feature information available from pre-trained face recognition networks. To that end, we propose the feature attention transformation (FAT) module, which effectively transfers low-level facial features to the facial generator (see the sketch below this record). In addition, to reduce dependency on any single pre-trained encoder, we propose a FAT module organization that accommodates the features of all pre-trained face recognition networks employed. We evaluate the proposed method using an "independent critic" as well as a "dependent critic," which enables objective judgment. Experimental results show that the proposed method significantly improves face frontalization performance and helps overcome the bias associated with each pre-trained face recognition network employed.
- Keywords
- Face frontalization; face pose normalization; face recognition; generative modeling
- ISSN
- 2169-3536
- URI
- https://pubs.kist.re.kr/handle/201004/149611
- DOI
- 10.1109/ACCESS.2024.3377220
- Appears in Collections:
- KIST Article > 2024
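The abstract describes the FAT module only at a high level. As a rough illustration, the PyTorch sketch below shows one way such a block might inject a frozen pre-trained encoder's low-level feature map into a generator feature map via channel attention. The class name `FATModule`, the tensor shapes, and the attention formulation are assumptions made for illustration, not the authors' published implementation.

```python
# Hedged sketch of a FAT-style block, assuming a channel-attention
# formulation; this is NOT the authors' implementation from the paper.
import torch
import torch.nn as nn


class FATModule(nn.Module):
    """Hypothetical feature attention transformation (FAT) block.

    Projects a low-level feature map from a frozen, pre-trained face
    recognition encoder into the generator's channel space and injects
    it as a gated residual.
    """

    def __init__(self, enc_channels: int, gen_channels: int):
        super().__init__()
        # Project encoder features into the generator's channel space.
        self.proj = nn.Conv2d(enc_channels, gen_channels, kernel_size=1)
        # Channel-attention weights derived from the projected features.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(gen_channels, gen_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, gen_feat: torch.Tensor, enc_feat: torch.Tensor) -> torch.Tensor:
        # Resize encoder features to the generator's spatial resolution.
        enc_feat = nn.functional.interpolate(
            enc_feat, size=gen_feat.shape[-2:], mode="bilinear", align_corners=False
        )
        transformed = self.proj(enc_feat)
        # Gate the transformed features and add them as a residual.
        return gen_feat + self.attn(transformed) * transformed


if __name__ == "__main__":
    fat = FATModule(enc_channels=64, gen_channels=128)
    gen_feat = torch.randn(1, 128, 32, 32)  # generator feature map
    enc_feat = torch.randn(1, 64, 56, 56)   # pre-trained encoder feature map
    print(fat(gen_feat, enc_feat).shape)    # torch.Size([1, 128, 32, 32])
```

For the multi-encoder organization mentioned in the abstract, one plausible reading is to instantiate one such block per pre-trained face recognition network and sum their residual contributions, though the paper itself should be consulted for the actual design.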