Decoupled GNNs based on multi-view contrastive learning for scRNA-seq data clustering

Xiaoyan Yu, Yixuan Ren, Min Xia, Zhenqiu Shu*, Liehuang Zhu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Clustering is pivotal in deciphering cellular heterogeneity in single-cell RNA sequencing (scRNA-seq) data. However, it suffers from several challenges in handling the high dimensionality and complexity of scRNA-seq data. Especially when employing graph neural networks (GNNs) for cell clustering, the dependencies between cells expand exponentially with the number of layers. This results in high computational complexity, negatively impacting the model’s training efficiency. To address these challenges, we propose a novel approach, called decoupled GNNs, based on multi-view contrastive learning (scDeGNN), for scRNA-seq data clustering. Firstly, this method constructs two adjacency matrices to generate distinct views, and trains them using decoupled GNNs to derive the initial cell feature representations. These representations are then refined through a multilayer perceptron and a contrastive learning layer, ensuring the consistency and discriminability of the learned features. Finally, the learned representations are fused and applied to the cell clustering task. Extensive experimental results on nine real scRNA-seq datasets from various organisms and tissues show that the proposed scDeGNN method significantly outperforms other state-of-the-art scRNA-seq data clustering algorithms across multiple evaluation metrics.
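The pipeline described above can be sketched in a minimal form. The snippet below is an illustrative reconstruction, not the authors' released code: it assumes the "decoupled" design means precomputing k-step feature propagation over a normalized adjacency matrix separately from any learnable transformation (as in SGC/APPNP-style models), and uses a simple NT-Xent-style loss to enforce consistency between the two views. Function names and the choice of propagation depth are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def decoupled_propagate(A, X, k=2):
    """Decoupled GNN step: propagation is precomputed once, independent of
    learnable weights, so training cost does not grow with receptive field."""
    A_norm = normalize_adj(A)
    H = X.copy()
    for _ in range(k):
        H = A_norm @ H
    return H

def contrastive_loss(Z1, Z2, tau=0.5):
    """NT-Xent-style loss: the two views of the same cell are positives,
    all other cells in the batch are negatives."""
    Z1 =1 / np.linalg.norm(Z1, axis=1, keepdims=True) * Z1
    Z2 = 1 / np.linalg.norm(Z2, axis=1, keepdims=True) * Z2
    sim = (Z1 @ Z2.T) / tau
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    exp_sim = np.exp(sim)
    pos = np.diag(exp_sim)
    return float(-np.mean(np.log(pos / exp_sim.sum(axis=1))))
```

In this sketch the two views would come from two different adjacency matrices over the same cells (e.g. graphs built from different similarity measures); each view is propagated independently, passed through a shared MLP, and the contrastive loss pulls the per-cell embeddings of the two views together before fusion and clustering.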

Original language: English
Article number: bbaf198
Journal: Briefings in Bioinformatics
Volume: 26
Issue number: 3
DOIs
Publication status: Published - 1 May 2025
Externally published: Yes

Keywords

  • clustering
  • contrastive learning
  • decoupled
  • GNNs
  • multi-view
  • scRNA-seq data
