Paired Channel Enhanced Sign-Aware Graph Recommendation

Research output: Contribution to journal › Article › peer-review

Abstract

Graph-based recommendation systems have excelled in modeling user–item interactions, but most focus solely on positive feedback, overlooking the critical role of negative feedback in capturing comprehensive user preferences. Signed graphs, which incorporate both positive and negative interactions, offer a promising approach but face challenges due to the oversimplification of balance theory and the limitations of conventional graph neural networks (GNNs) in processing negative signals. To address these issues, we propose paired channel enhanced sign-aware graph recommendation (PCSRec), a novel framework that holistically integrates positive and negative feedback. PCSRec introduces a path-enhanced embedding module that leverages a learnable path-encoding matrix to capture indirect structural patterns, overcoming the limitations of balance theory. Additionally, it employs a paired channel filtering mechanism with spectral low-pass and high-pass filters to model similarity from positive feedback and dissimilarity from negative feedback, respectively. A dual-loss optimization strategy, combining contrastive and Bayesian personalized ranking (BPR) losses, further refines discriminative representations. Extensive experiments on four real-world datasets demonstrate that PCSRec outperforms both unsigned and signed graph-based baselines, achieving state-of-the-art recommendation performance. Ablation studies and visualizations confirm the effectiveness of its components in improving embedding quality and recommendation accuracy.
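The abstract names two concrete mechanisms: spectral low-pass filtering over positive edges (smoothing connected users and items toward similar embeddings) and high-pass filtering over negative edges (pushing them apart), refined by a BPR ranking loss. The sketch below illustrates these ideas only; the function names, the symmetric normalization, and the filter forms are our illustrative assumptions, not the paper's actual PCSRec implementation.

```python
import numpy as np

def norm_adj(A):
    # Symmetric normalization D^{-1/2} A D^{-1/2} (common in GNN recommenders;
    # assumed here, not stated by the abstract).
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def low_pass(A_pos, X):
    # Similarity channel: smooth embeddings across positive edges.
    return norm_adj(A_pos) @ X

def high_pass(A_neg, X):
    # Dissimilarity channel: keep the component that differs from
    # negative-edge neighbors (X minus its smoothed version).
    return X - norm_adj(A_neg) @ X

def bpr_loss(u, i_pos, i_neg):
    # Bayesian personalized ranking: the positively rated item should
    # score higher than the negative one for the same user.
    s = (u * i_pos).sum(-1) - (u * i_neg).sum(-1)
    return -np.log(1.0 / (1.0 + np.exp(-s))).mean()
```

For example, on a two-node graph with a single positive edge and constant embeddings, `low_pass` returns the embeddings unchanged while `high_pass` returns zeros, matching the intuition that positive neighbors are already maximally similar.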

Original language: English
Journal: IEEE Transactions on Computational Social Systems
Publication status: Accepted/In press - 2025
Externally published: Yes

Keywords

  • Negative feedback
  • Positive feedback
  • Signed graph recommendation

