TY - GEN
T1 - Progressive Self-Guided Hardness Distillation for Fine-Grained Visual Classification
AU - Wang, Yangdi
AU - Guo, Wenming
AU - Xu, Su Xiu
AU - Yuan, Shuozhi
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Fine-grained visual classification (FGVC) is a challenging problem due to its inherently high intra-class variance and low inter-class variance. Recently, vision transformers (ViTs) have demonstrated powerful performance in both traditional and FGVC settings. However, compared with most categories, some categories are difficult to classify because of similar characteristics and postures and strong background interference, limiting the performance improvement achieved by the utilized model. In this work, we present a novel method named progressive self-guided hardness distillation (PS-GHD), which defines a classification hardness judgment criterion and applies different approaches to the corresponding categories according to this criterion. The method progressively classifies the various categories over three stages through knowledge distillation, so it can correctly classify otherwise indistinguishable categories to some extent. We demonstrate the value of PS-GHD by experimenting on four popular fine-grained benchmarks: CUB-200-2011, NABirds, Stanford Cars, and Stanford Dogs. Our method achieves very competitive results on all four datasets. We also present qualitative results to enhance the interpretability of our model.
AB - Fine-grained visual classification (FGVC) is a challenging problem due to its inherently high intra-class variance and low inter-class variance. Recently, vision transformers (ViTs) have demonstrated powerful performance in both traditional and FGVC settings. However, compared with most categories, some categories are difficult to classify because of similar characteristics and postures and strong background interference, limiting the performance improvement achieved by the utilized model. In this work, we present a novel method named progressive self-guided hardness distillation (PS-GHD), which defines a classification hardness judgment criterion and applies different approaches to the corresponding categories according to this criterion. The method progressively classifies the various categories over three stages through knowledge distillation, so it can correctly classify otherwise indistinguishable categories to some extent. We demonstrate the value of PS-GHD by experimenting on four popular fine-grained benchmarks: CUB-200-2011, NABirds, Stanford Cars, and Stanford Dogs. Our method achieves very competitive results on all four datasets. We also present qualitative results to enhance the interpretability of our model.
KW - Fine-grained visual classification
KW - Hardness judgment
KW - Knowledge distillation
UR - http://www.scopus.com/inward/record.url?scp=85205031925&partnerID=8YFLogxK
U2 - 10.1109/IJCNN60899.2024.10650553
DO - 10.1109/IJCNN60899.2024.10650553
M3 - Conference contribution
AN - SCOPUS:85205031925
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 International Joint Conference on Neural Networks, IJCNN 2024
Y2 - 30 June 2024 through 5 July 2024
ER -