TY - GEN
T1 - An Efficient Marine Species Classifier via Knowledge Distillation and Hybrid Attention
AU - Zhu, Kaishi
AU - Chu, Xiaohui
AU - Liu, Yutao
AU - Hu, Runze
AU - Xu, Lijun
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Marine species recognition is a crucial task in ocean exploration. Despite the impressive performance of deep-learning-based classification methods, these approaches often suffer from high complexity in terms of model parameters, storage size, and FLOPs. This complexity leads to slow inference and high consumption of computational resources, which presents a significant challenge for ocean engineering. To address this challenge, we propose a lightweight framework for marine species recognition based on knowledge distillation and the attention mechanism. Our method introduces a relatively complex teacher classifier built on a ResNet-18 backbone, together with an attention module to enhance its performance. We design the student classifier using only a few convolutional layers, thereby significantly reducing the number of parameters. Extensive experiments demonstrate that the student network achieves performance competitive with other representative lightweight models such as ShuffleNetv2 and MobileNetv3-small, while its parameter count is only 1.624% of that of MobileNetv3-small.
AB - Marine species recognition is a crucial task in ocean exploration. Despite the impressive performance of deep-learning-based classification methods, these approaches often suffer from high complexity in terms of model parameters, storage size, and FLOPs. This complexity leads to slow inference and high consumption of computational resources, which presents a significant challenge for ocean engineering. To address this challenge, we propose a lightweight framework for marine species recognition based on knowledge distillation and the attention mechanism. Our method introduces a relatively complex teacher classifier built on a ResNet-18 backbone, together with an attention module to enhance its performance. We design the student classifier using only a few convolutional layers, thereby significantly reducing the number of parameters. Extensive experiments demonstrate that the student network achieves performance competitive with other representative lightweight models such as ShuffleNetv2 and MobileNetv3-small, while its parameter count is only 1.624% of that of MobileNetv3-small.
KW - Attention Mechanism
KW - Deep Learning
KW - Knowledge Distillation
KW - Ocean Engineering
UR - http://www.scopus.com/inward/record.url?scp=85187372700&partnerID=8YFLogxK
U2 - 10.1109/SWC57546.2023.10449174
DO - 10.1109/SWC57546.2023.10449174
M3 - Conference contribution
AN - SCOPUS:85187372700
T3 - Proceedings - 2023 IEEE SmartWorld, Ubiquitous Intelligence and Computing, Autonomous and Trusted Vehicles, Scalable Computing and Communications, Digital Twin, Privacy Computing and Data Security, Metaverse, SmartWorld/UIC/ATC/ScalCom/DigitalTwin/PCDS/Metaverse 2023
BT - Proceedings - 2023 IEEE SmartWorld, Ubiquitous Intelligence and Computing, Autonomous and Trusted Vehicles, Scalable Computing and Communications, Digital Twin, Privacy Computing and Data Security, Metaverse, SmartWorld/UIC/ATC/ScalCom/DigitalTwin/PCDS/Metaverse 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th IEEE Smart World Congress, SWC 2023
Y2 - 28 August 2023 through 31 August 2023
ER -