A Multimodal Approach to Depression Detection from Social Media Data

Abstract
In recent years, depression has become one of the leading causes of suicide. The widespread use of social media platforms and the anonymity of online communication give individuals with depression a channel through which to express their emotions, which in turn presents a valuable opportunity to detect potential cases of depression promptly. However, current machine learning-based depression detection methods still suffer from limited performance and reliance on single-modal data. To address these limitations, we propose a multi-modal depression detection model (MDD) that leverages the multi-head attention mechanism to process features from user-generated text, images, and time data. Experimental results show that MDD performs strongly on depression detection, achieving an F1 score of 91.38%, precision of 90.86%, and recall of 91.90%. Our approach provides an effective method for user-level depression detection by considering multiple forms of user-generated content on social media, enabling better identification of depressed patients.
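The abstract describes fusing per-modality features with multi-head attention. The sketch below is a minimal NumPy illustration of that general mechanism, not the authors' MDD architecture: the embedding size, head count, random weights, and the three-token sequence of hypothetical text/image/time embeddings are all assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product multi-head self-attention over a sequence
    of modality embeddings x with shape [seq_len, d_model]."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Randomly initialised projections (training is out of scope here).
    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                          for _ in range(4))
    q, k, v = x @ w_q, x @ w_k, x @ w_v

    def split_heads(t):
        # [seq_len, d_model] -> [num_heads, seq_len, d_head]
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    qh, kh, vh = split_heads(q), split_heads(k), split_heads(v)
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)   # [heads, seq, seq]
    attn = softmax(scores, axis=-1)                         # attention weights
    out = (attn @ vh).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o                                        # fused [seq, d_model]

rng = np.random.default_rng(0)
d_model = 64
# Hypothetical per-modality embeddings: text, image, and posting-time features.
modalities = np.stack([rng.standard_normal(d_model) for _ in range(3)])
fused = multi_head_attention(modalities, num_heads=4, rng=rng)
print(fused.shape)  # (3, 64)
```

Each modality embedding attends to the others, so the fused representation can weight, say, image features more heavily when text is uninformative; a real model would learn the projection weights and feed the fused output to a classifier.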
| Original language | English |
|---|---|
| Journal | Pacific Asia Conference on Information Systems |
| Publication status | Published - 2025 |
| Event | 29th Pacific Asia Conference on Information Systems, PACIS 2025 - Kuala Lumpur, Malaysia (5 Jul 2025 – 9 Jul 2025) |
Keywords
- Depression detection
- Multi-head attention mechanism
- Multimodal data
- Social media