A solver of Fukunaga-Koontz transformation without matrix decomposition

Hao Su, Jie Yang, Lei Sun, Zhiping Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The Fukunaga-Koontz transformation (FKT) provides a powerful tool for extracting discriminant subspaces in pattern classification. These subspaces are generally extracted through a matrix decomposition involving scatter matrices, where a nontrivial singularity problem is inevitable when the number of samples is limited. In this work, instead of matrix decomposition, a novel subspace extraction procedure based on solving a set of least-norm equations is proposed. The procedure does not rely on a large sample size, and its computational complexity depends only on the number of samples. Experiments on the benchmark MNIST and PIE face recognition datasets show the promising potential of the proposed method for image-based recognition applications where the image size is large but the number of samples is limited.
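For context, the conventional FKT that the paper improves on whitens the summed class scatter matrix and then eigendecomposes one whitened scatter; in that space the two whitened scatters share eigenvectors with eigenvalues λ and 1 − λ, which is what makes the basis discriminative. The sketch below shows this standard decomposition-based FKT (not the paper's least-norm solver); the function name and interface are illustrative assumptions.

```python
import numpy as np

def classical_fkt(X1, X2, eps=1e-10):
    """Standard Fukunaga-Koontz transform via eigendecomposition.

    X1, X2: (n_samples, n_features) data matrices for the two classes.
    Returns (F, lam): the FKT basis F and the shared eigenvalues lam,
    ordered from class-2-dominant to class-1-dominant directions.
    """
    # Class scatter matrices (uncentred, for simplicity)
    S1 = X1.T @ X1
    S2 = X2.T @ X2

    # Whiten the summed scatter P = S1 + S2 = V D V^T
    D, V = np.linalg.eigh(S1 + S2)
    keep = D > eps                       # discard singular directions
    W = V[:, keep] / np.sqrt(D[keep])    # whitening operator: W^T P W = I

    # In the whitened space S1~ + S2~ = I, so S1~ and S2~ share
    # eigenvectors, with eigenvalues lambda and 1 - lambda.
    S1t = W.T @ S1 @ W
    lam, U = np.linalg.eigh(S1t)

    return W @ U, lam
```

Note that this route requires eigendecompositions of feature-dimension-sized scatter matrices, which is exactly where the singularity problem arises when the sample count is small relative to the image size.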

Original language: English
Title of host publication: 2021 IEEE International Symposium on Circuits and Systems, ISCAS 2021 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728192017
DOIs
Publication status: Published - 2021
Event: 53rd IEEE International Symposium on Circuits and Systems, ISCAS 2021 - Daegu, Korea, Republic of
Duration: 22 May 2021 – 28 May 2021

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
Volume: 2021-May
ISSN (Print): 0271-4310

Conference

Conference: 53rd IEEE International Symposium on Circuits and Systems, ISCAS 2021
Country/Territory: Korea, Republic of
City: Daegu
Period: 22/05/21 – 28/05/21

Keywords

  • Binary classification
  • Face recognition
  • Fukunaga-Koontz transformation
  • Subspace analysis
