TY - GEN
T1 - Who Moved My Cheese? Human and Non-human Motion Recognition with WiFi
AU - Zhu, Guozhen
AU - Wu, Chenshu
AU - Zeng, Xiaolu
AU - Wang, Beibei
AU - Liu, K. J. Ray
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Recently, an extensive amount of research has focused on indoor intelligent perception applications and systems. However, the performance of these applications can be greatly impacted by the movement of non-human subjects, such as pets, robots, and electrical appliances, making them impractical for mass use. In this paper, we present the first system that passively and unobtrusively distinguishes between moving human and non-human subjects with a single pair of commodity WiFi transceivers, without requiring the subjects to wear any device or move within a restricted area. Our system can detect moving subjects, extract physically and statistically explainable features of their motion, and distinguish non-human and human movements accordingly. Leveraging the state-of-the-art rich-scattering multipath model, our system can differentiate human and non-human motion through walls, even in complex environments. Built on environment-independent features, our system can be applied to new environments without further effort from users. We validate the performance with commodity WiFi in four different buildings on subjects including a pet, a vacuum robot, a human, and a fan. The results show that our system achieves 97.7% recognition accuracy and a 95.7% true positive rate for non-human motion recognition. Furthermore, it achieves 95.2% accuracy in unseen environments without model tuning, demonstrating its accuracy and robustness for ubiquitous use.
AB - Recently, an extensive amount of research has focused on indoor intelligent perception applications and systems. However, the performance of these applications can be greatly impacted by the movement of non-human subjects, such as pets, robots, and electrical appliances, making them impractical for mass use. In this paper, we present the first system that passively and unobtrusively distinguishes between moving human and non-human subjects with a single pair of commodity WiFi transceivers, without requiring the subjects to wear any device or move within a restricted area. Our system can detect moving subjects, extract physically and statistically explainable features of their motion, and distinguish non-human and human movements accordingly. Leveraging the state-of-the-art rich-scattering multipath model, our system can differentiate human and non-human motion through walls, even in complex environments. Built on environment-independent features, our system can be applied to new environments without further effort from users. We validate the performance with commodity WiFi in four different buildings on subjects including a pet, a vacuum robot, a human, and a fan. The results show that our system achieves 97.7% recognition accuracy and a 95.7% true positive rate for non-human motion recognition. Furthermore, it achieves 95.2% accuracy in unseen environments without model tuning, demonstrating its accuracy and robustness for ubiquitous use.
KW - WiFi sensing
KW - motion recognition
KW - non-human motion identification
KW - pet recognition
UR - http://www.scopus.com/inward/record.url?scp=85146111134&partnerID=8YFLogxK
U2 - 10.1109/MASS56207.2022.00073
DO - 10.1109/MASS56207.2022.00073
M3 - Conference contribution
AN - SCOPUS:85146111134
T3 - Proceedings - 2022 IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems, MASS 2022
SP - 476
EP - 484
BT - Proceedings - 2022 IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems, MASS 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 19th IEEE International Conference on Mobile Ad Hoc and Smart Systems, MASS 2022
Y2 - 20 October 2022 through 22 October 2022
ER -