Abstract
Traditional radar detection methods struggle to detect Low-Slow-Small (LSS) targets at long distances in cluttered urban environments. To address this issue, we develop a Low-Slow-Small target Detection Network (LSSDNet). This network sharpens target features in the Range-Doppler map (RDMap) while suppressing background clutter and noise. The backbone of LSSDNet is a three-dimensional UNet (3DUNet). A Spatial Transformer (ST) is embedded at each scale of the UNet to enhance the model's spatial feature perception, and a Cross Denoising Attention (CDA) module is proposed to filter clutter and noise in the skip connections. To avoid the high cost of manually annotating radar datasets, an Optimized RDMap (ORDMap) is designed as the training label; its generation is fully automated and requires no human involvement. The ORDMap is based on a 2D Gaussian distribution, which is more conducive to network training than the conventional one-hot label. Because radar target detection involves multiple tasks, such as classification and localization, we define a Radar Target Detection Loss (RTDLoss) that combines distinct loss terms. Furthermore, we propose a Point Confidence Factor (PCF) as a replacement for Intersection over Union (IoU) to characterize radar localization capability. Because publicly available long-distance radar datasets are scarce, we independently create an Urban Low-altitude Centimeter wave Radar (ULCR) dataset and conduct comparative experiments against state-of-the-art methods. The results demonstrate that our method holds a clear detection advantage when the echoes of LSS targets are extremely weak.
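To illustrate the ORDMap idea of a 2D-Gaussian soft label on the Range-Doppler grid, the sketch below generates one Gaussian peak per target cell. This is a minimal illustration, not the authors' implementation; the function name `make_ordmap_label` and the spread parameters `sigma_r` and `sigma_d` are assumptions for the example.

```python
import numpy as np

def make_ordmap_label(n_range_bins, n_doppler_bins, targets,
                      sigma_r=2.0, sigma_d=2.0):
    """Build a soft label map: one 2D Gaussian per target, peak value 1.0.

    targets: iterable of (range_bin, doppler_bin) target centers.
    """
    r = np.arange(n_range_bins)[:, None]      # column of range indices
    d = np.arange(n_doppler_bins)[None, :]    # row of Doppler indices
    label = np.zeros((n_range_bins, n_doppler_bins), dtype=np.float32)
    for r0, d0 in targets:
        g = np.exp(-((r - r0) ** 2 / (2 * sigma_r ** 2)
                     + (d - d0) ** 2 / (2 * sigma_d ** 2)))
        label = np.maximum(label, g)          # keep the strongest response per cell
    return label

# Example: a 128x64 RD map with two hypothetical targets.
label = make_ordmap_label(128, 64, [(40, 10), (90, 50)])
print(label.shape, label.max())               # (128, 64) 1.0
```

Compared with a one-hot label, such a soft Gaussian label gives the network a smooth gradient signal around each target cell, which is the training advantage the abstract attributes to the ORDMap.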
| Original language | English |
| --- | --- |
| Journal | IEEE Sensors Journal |
| DOIs | |
| Publication status | Accepted/In press - 2025 |
| Externally published | Yes |
Keywords
- LSS targets
- LSSDNet
- ORDMap
- radar target detection
- RDMap