CWSNet: A Building Layout Sensing Network with Corner and Wall Information Fusion from Through-the-Wall Radar

Research output: Contribution to journal › Article › peer-review

Abstract

Building layout sensing with through-the-wall radar (TWR) plays a vital role in fields such as counter-terrorism operations and post-disaster rescue. Existing TWR-based layout sensing methods typically focus solely on either corner information or wall surface features, neglecting the complementarity between the two, which leads to low sensing accuracy in complex environments. To address this issue, we propose the Corner-Wall Sensing Network (CWSNet), a building layout sensing network that fuses corner and wall surface information. First, deep convolutional networks extract wall and corner features from TWR images. Then, these complementary structural features are fused into an integrated representation. Finally, a transformer-based dynamic graph reasoning module (DGRM) captures their spatial relationships, enabling high-precision layout sensing. Experiments on both simulated and real-world datasets demonstrate that CWSNet significantly outperforms existing methods across multiple evaluation metrics, achieving superior wall localization accuracy and layout connectivity while also exhibiting strong robustness and generalization.
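The pipeline the abstract describes (branch-wise feature extraction, fusion, then attention-based relational reasoning over candidate structures) can be illustrated with a minimal pure-Python sketch. This is not the paper's implementation: the functions `fuse` and `attention`, the concatenation fusion, and the single-head dot-product attention standing in for the DGRM are all illustrative assumptions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def fuse(corner_feat, wall_feat):
    """Fuse corner- and wall-branch features by concatenation.
    (Hypothetical fusion; the paper's fusion scheme may differ.)"""
    return corner_feat + wall_feat

def attention(nodes):
    """Single-head scaled dot-product self-attention over node features,
    a toy stand-in for the transformer-based graph reasoning (DGRM):
    each candidate structure attends to all others to refine itself."""
    dim = len(nodes[0])
    out = []
    for q in nodes:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(dim)
                  for k in nodes]
        w = softmax(scores)
        # Output is a convex combination of all node features.
        out.append([sum(wj * node[d] for wj, node in zip(w, nodes))
                    for d in range(dim)])
    return out

# Toy usage: two candidate wall segments, each with fused corner+wall features.
node_a = fuse([0.9, 0.1], [0.2, 0.8])   # strong corner evidence
node_b = fuse([0.1, 0.2], [0.9, 0.7])   # strong wall-surface evidence
refined = attention([node_a, node_b])    # relation-aware refined features
```

The point of the sketch is the structure, not the numbers: complementary branch features are merged first, and only then does the relational module reason over all candidates jointly, which is how corner and wall cues can correct each other.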

Original language: English
Journal: IEEE Signal Processing Letters
DOIs
Publication status: Accepted/In press - 2026

Keywords

  • Building layout sensing
  • corner–wall fusion
  • feature relationship learning
  • through-the-wall radar (TWR)
  • transformer-based dynamic graph reasoning module (DGRM)

