Abstract
Building layout sensing with through-the-wall radar (TWR) plays a vital role in fields such as counter-terrorism operations and post-disaster rescue. Existing TWR-based layout sensing methods typically focus on either corner information or wall surface features alone, neglecting the complementarity between the two, which leads to low sensing accuracy in complex environments. To address this issue, we propose the Corner-Wall Sensing Network (CWSNet), a building layout sensing network that fuses corner and wall surface information. First, deep convolutional networks extract wall and corner features from TWR images. Then, these complementary structural features are fused into an integrated representation. Finally, a transformer-based dynamic graph reasoning module (DGRM) captures their spatial relationships, enabling high-precision layout sensing. Experiments on both simulated and real-world datasets demonstrate that CWSNet significantly outperforms existing methods across multiple evaluation metrics, achieving superior wall localization accuracy and layout connectivity while also exhibiting strong robustness and generalization.
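The pipeline described in the abstract — two feature streams, fusion, then transformer-style relational reasoning — can be sketched in miniature. This is a minimal NumPy illustration, not the authors' implementation: the linear `encode` stands in for the deep convolutional extractors, the fused patch vectors play the role of graph nodes, and a single self-attention pass stands in for the DGRM; all names, shapes, and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w):
    # Stand-in for a deep convolutional feature extractor:
    # a single nonlinear projection of each radar-image patch.
    return np.tanh(x @ w)

def self_attention(nodes, wq, wk, wv):
    # Transformer-style self-attention: each node attends to all
    # others, modeling pairwise spatial relationships among the
    # fused corner/wall features (the role played by the DGRM).
    q, k, v = nodes @ wq, nodes @ wk, nodes @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

n_nodes, d_in, d = 16, 32, 8
radar_patches = rng.normal(size=(n_nodes, d_in))  # stand-in TWR image patches

# Two complementary feature streams extracted from the same input.
wall_feat   = encode(radar_patches, rng.normal(size=(d_in, d)))
corner_feat = encode(radar_patches, rng.normal(size=(d_in, d)))

# Fuse the streams into one integrated representation per node.
fused = np.concatenate([wall_feat, corner_feat], axis=-1)  # shape (16, 16)

# Relational reasoning over the fused node features.
wq, wk, wv = (rng.normal(size=(2 * d, 2 * d)) for _ in range(3))
refined = self_attention(fused, wq, wk, wv)
print(refined.shape)  # (16, 16)
```

In the actual network the refined node features would feed a decoding head that predicts wall positions and layout connectivity; the sketch stops at the reasoning stage it is meant to illustrate.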
| Original language | English |
|---|---|
| Pages (from-to) | 703-707 |
| Number of pages | 5 |
| Journal | IEEE Signal Processing Letters |
| Volume | 33 |
| DOIs | |
| Publication status | Published - 2026 |
Keywords
- Building layout sensing
- corner–wall fusion
- feature relationship learning
- through-the-wall radar (TWR)
- transformer-based dynamic graph reasoning module (DGRM)
Title
CWSNet: A Building Layout Sensing Network With Corner and Wall Information Fusion From Through-the-Wall Radar