TY - JOUR
T1 - SigGen
T2 - Signal Generation for Wireless Sensing Based on Disentangled Representation
AU - He, Hanxiang
AU - Huan, Xintao
AU - Luo, Yong
AU - Fan, Rongfei
AU - Xu, Jie
AU - Hu, Han
N1 - Publisher Copyright:
© 2002-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - With the thriving artificial intelligence-generated content (AIGC), it is becoming increasingly appealing to exploit generative AI to generate wireless signals for facilitating wireless sensing. However, this is a challenging task, as wireless signals are highly random in general and contain rich physical information. To tackle these challenges, we propose a novel signal disentanglement and generation framework termed SigGen, which is inspired by the Fourier Transform (FT) that converts signals to the frequency domain and accordingly separates objects by distinct frequency bands. In our proposed framework, we first disentangle the features of objects embedded in the signal and subsequently modify these features to generate the desired signals. Specifically, we devise a neural network based on the vision transformer (ViT) to extract effective features for signal generation. In this neural network, we incorporate both local and global frequency attention modules to adaptively leverage frequency features, and introduce a hybrid patch embedding module to enhance information interaction for the ViT architecture. Furthermore, we propose a novel sequential training method to improve the disentanglement and generation capability of the neural network. Finally, extensive experiments on two benchmark public wireless sensing datasets demonstrate that our framework can effectively decouple wireless signals and generate diverse signals closely resembling real ones, surpassing state-of-the-art methods by 30.83%. A practical case study further demonstrates that our framework can be used as a data augmentation method to improve gesture recognition accuracy by 12.74%.
KW - feature disentanglement
KW - signal generation
KW - Wireless sensing
UR - https://www.scopus.com/pages/publications/105012752412
U2 - 10.1109/TWC.2025.3594547
DO - 10.1109/TWC.2025.3594547
M3 - Article
AN - SCOPUS:105012752412
SN - 1536-1276
JO - IEEE Transactions on Wireless Communications
JF - IEEE Transactions on Wireless Communications
ER -