TY - JOUR
T1 - RDR-KD: A Knowledge Distillation Detection Framework for Drone Scenes
T2 - IEEE Geoscience and Remote Sensing Letters
AU - Huang, Jinxiang
AU - Chang, Hong
AU - Yang, Xin
AU - Liu, Yong
AU - Liu, Shuqi
AU - Song, Yong
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - Drone object detection (DOD) with real-time deployment is a research hotspot. On the one hand, tiny object detection performance is closely tied to the ground detection capability of the drone platform, and existing methods tend to design complex networks to improve the accuracy of tiny objects, which significantly increases computational cost. On the other hand, the limited hardware resources of drones urgently require lightweight models for deployment. To address the dilemma of balancing detection accuracy and computational efficiency, we propose a regenerated-decoupled-responsive knowledge distillation (RDR-KD) framework specifically for drone scenes. First, we design the Regenerated Distillation and the Decoupled Distillation to fully transfer tiny object feature information from the teacher model to the student model. Meanwhile, we devise a logit-based Responsive Distillation built on focal loss and efficient intersection over union (EIoU) to alleviate class imbalance. Finally, we conduct extensive experiments on the VisDrone2019 dataset. The experimental results demonstrate that the proposed RDR-KD framework improves the AP and AP_S of the student model by 3.3% and 2.9%, respectively, outperforming other state-of-the-art distillation frameworks.
AB - Drone object detection (DOD) with real-time deployment is a research hotspot. On the one hand, tiny object detection performance is closely tied to the ground detection capability of the drone platform, and existing methods tend to design complex networks to improve the accuracy of tiny objects, which significantly increases computational cost. On the other hand, the limited hardware resources of drones urgently require lightweight models for deployment. To address the dilemma of balancing detection accuracy and computational efficiency, we propose a regenerated-decoupled-responsive knowledge distillation (RDR-KD) framework specifically for drone scenes. First, we design the Regenerated Distillation and the Decoupled Distillation to fully transfer tiny object feature information from the teacher model to the student model. Meanwhile, we devise a logit-based Responsive Distillation built on focal loss and efficient intersection over union (EIoU) to alleviate class imbalance. Finally, we conduct extensive experiments on the VisDrone2019 dataset. The experimental results demonstrate that the proposed RDR-KD framework improves the AP and AP_S of the student model by 3.3% and 2.9%, respectively, outperforming other state-of-the-art distillation frameworks.
KW - Drone object detection (DOD)
KW - feature distillation
KW - knowledge distillation (KD)
KW - responsive distillation
UR - http://www.scopus.com/inward/record.url?scp=85192985788&partnerID=8YFLogxK
U2 - 10.1109/LGRS.2024.3398140
DO - 10.1109/LGRS.2024.3398140
M3 - Article
AN - SCOPUS:85192985788
SN - 1545-598X
VL - 21
SP - 1
EP - 5
JO - IEEE Geoscience and Remote Sensing Letters
JF - IEEE Geoscience and Remote Sensing Letters
M1 - 6008705
ER -