Improving reflectance reconstruction from tristimulus values by adaptively combining colorimetric and reflectance similarities

Bin Cao, Ningfang Liao*, Yasheng Li, Haobo Cheng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

The use of spectral reflectance as fundamental color information finds application in diverse imaging-related fields. Many reconstruction approaches rely on training sets to train the algorithm, and modifying the training set markedly affects the accuracy of classical reflectance reconstruction methods. Different modification criteria are not always mutually consistent, because they emphasize different aspects: spectral reflectance similarity focuses on the deviation of the reconstructed reflectance, whereas colorimetric similarity emphasizes human perception. We present a method that improves the accuracy of the reconstructed spectral reflectance by adaptively combining colorimetric and spectral reflectance similarities. Different exponential factors for the weighting coefficients were investigated. The spectral reflectance reconstructed by the proposed method shows considerable improvements in the root-mean-square error and the goodness-of-fit coefficient of the spectral reflectance errors, as well as in color differences under different illuminants. The method is applicable to diverse areas such as textiles, printing, art, and other industries.
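To illustrate the general idea of weighting a training set by combined similarities, the sketch below is a minimal, hypothetical two-pass scheme: it first weights training reflectances by colorimetric closeness to the target tristimulus values, then refines the weights with spectral-reflectance closeness to the first-pass estimate. The function name, the two-pass structure, and the exponents `alpha`/`beta` are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def reconstruct_reflectance(t_target, T_train, R_train,
                            alpha=2.0, beta=2.0, eps=1e-6):
    """Illustrative sketch (not the paper's exact method):
    adaptively combine colorimetric and reflectance similarities.

    t_target : (3,)   tristimulus values of the target color
    T_train  : (n, 3) tristimulus values of the training samples
    R_train  : (n, w) spectral reflectances of the training samples
    alpha, beta : hypothetical exponential factors of the weights
    """
    # Pass 1: colorimetric similarity -- distance in tristimulus space
    d_col = np.linalg.norm(T_train - t_target, axis=1)
    w1 = 1.0 / (d_col + eps) ** alpha
    r_est = (w1[:, None] * R_train).sum(axis=0) / w1.sum()

    # Pass 2: reflectance similarity -- RMS distance to the first-pass
    # estimate, combined multiplicatively with the colorimetric weight
    d_ref = np.sqrt(np.mean((R_train - r_est) ** 2, axis=1))
    w2 = 1.0 / ((d_col + eps) ** alpha * (d_ref + eps) ** beta)
    return (w2[:, None] * R_train).sum(axis=0) / w2.sum()
```

Because both weights are convex (they sum to one after normalization), the reconstruction stays inside the range spanned by the training reflectances, and a target whose tristimulus values coincide with a training sample reproduces that sample's reflectance almost exactly.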

Original language: English
Article number: 053104
Journal: Optical Engineering
Volume: 56
Issue number: 5
DOIs
Publication status: Published - 1 May 2017

Keywords

  • Adaptive approach
  • Colorimetric similarity
  • Reflectance
  • Reflectance reconstruction
  • Reflectance similarity

