A Deep Facial BRDF Estimation Method Based on Image Translation

Lulu Feng, Dongdong Weng*, Bin Liang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

1 Citation (Scopus)

Abstract

The reconstruction of photorealistic 3D face geometry, texture, and reflectance (BRDF) is one of the most active fields in computer vision, graphics, and machine learning. However, the acquisition of facial reflectance remains a challenge. In this article, we propose a method for estimating the facial reflectance properties of a single portrait image based on image translation: from an RGB face image, we obtain a BRDF with a large amount of detail. To achieve this, we perform a reverse-engineering step that renders face images from the obtained texture maps under the Blinn-Phong illumination model to form training data pairs. We also apply random rotate-and-crop and sliding-window-crop operations to augment the data, and optimize the network weights by minimizing an adversarial loss and a reconstruction loss. As demonstrated in a series of quantitative and qualitative experiments, our method achieves superior performance compared to state-of-the-art methods.
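The abstract's training-pair synthesis relies on the Blinn-Phong illumination model. As a rough illustration only (the paper's actual renderer, parameter values, and light setup are not given here, so all constants below are assumptions), per-pixel Blinn-Phong shading can be sketched as:

```python
import numpy as np

def blinn_phong(normal, light_dir, view_dir, albedo,
                light_color=1.0, ambient=0.1,
                specular=0.5, shininess=32.0):
    """Per-pixel Blinn-Phong shading sketch.

    normal, light_dir, view_dir: arrays of shape (..., 3);
    albedo: diffuse texture value in [0, 1].
    The coefficient values are illustrative assumptions, not the paper's.
    """
    n = normal / np.linalg.norm(normal, axis=-1, keepdims=True)
    l = light_dir / np.linalg.norm(light_dir, axis=-1, keepdims=True)
    v = view_dir / np.linalg.norm(view_dir, axis=-1, keepdims=True)

    # Diffuse term: Lambertian cosine falloff, clamped at zero.
    diff = np.clip(np.sum(n * l, axis=-1, keepdims=True), 0.0, None)

    # Specular term: half-vector between light and view directions.
    h = l + v
    h = h / np.linalg.norm(h, axis=-1, keepdims=True)
    spec = np.clip(np.sum(n * h, axis=-1, keepdims=True), 0.0, None) ** shininess

    return light_color * (ambient * albedo + diff * albedo + specular * spec)
```

Rendering a texture map this way under known lights produces the (rendered image, texture/BRDF map) pairs that a supervised image-translation network can be trained on.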
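The abstract's objective combines an adversarial loss with a reconstruction loss, in the style of conditional-GAN image translation (pix2pix). A minimal numeric sketch of such an objective, assuming sigmoid discriminator outputs, an L1 reconstruction term, and an illustrative weighting `lam` (none of which are specified by the paper):

```python
import numpy as np

def cgan_losses(disc_real, disc_fake, generated, target, lam=100.0):
    """Conditional-GAN objective sketch (pix2pix-style), an assumption
    about the paper's 'adversarial + reconstruction' combination.

    disc_real / disc_fake: discriminator probabilities in (0, 1)
    generated / target: predicted and ground-truth BRDF maps
    """
    eps = 1e-8  # numerical guard for the logarithms
    # Discriminator: push real patches toward 1, generated toward 0.
    d_loss = (-np.mean(np.log(disc_real + eps))
              - np.mean(np.log(1.0 - disc_fake + eps)))
    # Generator: fool the discriminator, plus weighted L1 reconstruction.
    g_adv = -np.mean(np.log(disc_fake + eps))
    g_rec = np.mean(np.abs(generated - target))
    g_loss = g_adv + lam * g_rec
    return d_loss, g_loss
```

The large L1 weight is the conventional pix2pix choice; it keeps the generated BRDF maps close to the reverse-engineered ground truth while the adversarial term sharpens detail.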

Original language: English
Article number: 012011
Journal: Journal of Physics: Conference Series
Volume: 2363
Issue number: 1
DOI: 10.1088/1742-6596/2363/1/012011
Publication status: Published - 2022
Event: 2022 4th International Conference on Artificial Intelligence and Computer Science, AICS 2022 - Beijing, China
Duration: 30 Jul 2022 - 31 Jul 2022

Keywords

  • 3D face reconstruction
  • BRDF estimation
  • cGAN
  • image translation


Cite this

Feng, L., Weng, D., & Liang, B. (2022). A Deep Facial BRDF Estimation Method Based on Image Translation. Journal of Physics: Conference Series, 2363(1), Article 012011. https://doi.org/10.1088/1742-6596/2363/1/012011