TY - JOUR
T1 - A Multi-Modal Gait Analysis-Based Detection System of the Risk of Depression
AU - Shao, Wei
AU - You, Zhiyang
AU - Liang, Lesheng
AU - Hu, Xiping
AU - Li, Chengming
AU - Wang, Wei
AU - Hu, Bin
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022/10/1
Y1 - 2022/10/1
N2 - Depression has become a common mental disorder, especially among postgraduates. Postgraduates are reported to have a higher risk of depression than the general public, and they are more sensitive to contact with others. Thus, a non-contact and effective method for detecting people at risk of depression is in urgent demand. To make the recognition of depression more reliable and convenient, we propose a multi-modal gait analysis-based depression detection method that combines the skeleton modality and the silhouette modality. First, we propose a skeleton feature set to describe depression and train a Long Short-Term Memory (LSTM) model to capture the sequential dynamics of the skeleton features. Second, we generate Gait Energy Images (GEIs) as silhouette features from RGB videos, and design two Convolutional Neural Network (CNN) models with a new loss function to extract silhouette features from the front and side perspectives. We then construct a multi-modal fusion model that fuses the front- and side-view silhouettes at the feature level and the classification results of the different modalities at the decision level. The proposed multi-modal model achieved an accuracy of 85.45% on a dataset of 200 postgraduate students (including 86 with depression), 5.17% higher than the best single-modality model. The multi-modal method also shows improved generalization by reducing gender differences. Furthermore, we design a vivid 3D visualization of the gait skeletons, and our results imply that gait is a potent biometric for depression detection.
KW - Depression
KW - fusion model
KW - gait
KW - multi-modal
KW - skeleton
UR - http://www.scopus.com/inward/record.url?scp=85139572989&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2021.3122299
DO - 10.1109/JBHI.2021.3122299
M3 - Article
C2 - 34699374
AN - SCOPUS:85139572989
SN - 2168-2194
VL - 26
SP - 4859
EP - 4868
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 10
ER -