Probability graph complementation contrastive learning

Neural Netw. 2024 Nov:179:106522. doi: 10.1016/j.neunet.2024.106522. Epub 2024 Jul 9.

Abstract

Graph Neural Networks (GNNs) have achieved remarkable progress in the field of graph representation learning. Their most prominent characteristic, propagating features along edges, degrades performance on most heterophilic graphs. Some studies attempt to construct a KNN graph to improve graph homophily. However, there is no prior knowledge for choosing a proper K, and such methods may suffer from the problem of Inconsistent Similarity Distribution (ISD). To address this issue, we propose Probability Graph Complementation Contrastive Learning (PGCCL), which adaptively constructs the complementation graph. We employ a Beta Mixture Model (BMM) to distinguish intra-class similarity from inter-class similarity. Based on the posterior probability, we construct Probability Complementation Graphs to form contrastive views. Contrastive learning prompts the model to preserve complementary information for each node across the different views. By combining the original graph embedding with the complementary graph embedding, the final embedding captures rich semantics in the fine-tuning stage. Finally, comprehensive experimental results on 20 datasets, including homophilic and heterophilic graphs, firmly verify the effectiveness of our algorithm as well as the quality of the probability complementation graph compared with other state-of-the-art methods.
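The core BMM step can be illustrated in isolation. The sketch below is not the authors' implementation; it is a minimal, self-contained example of the general technique the abstract names: fit a two-component Beta mixture to pairwise node similarities in (0, 1) via expectation maximization (with a moment-matching M-step, an assumed simplification), then use the posterior probability of the higher-mean component as a soft intra-class score. All function names and initial parameter guesses are illustrative.

```python
import math
import random

def beta_pdf(x, a, b):
    # Beta density computed via log-gamma for numerical stability.
    logp = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))
    return math.exp(logp)

def fit_beta_mixture(sims, iters=50):
    """EM for a 2-component Beta mixture over similarities in (0, 1).

    The M-step re-estimates each component's (alpha, beta) by matching
    the responsibility-weighted mean and variance (method of moments).
    """
    params = [(2.0, 5.0), (5.0, 2.0)]  # illustrative initial guesses
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each similarity.
        resp = []
        for x in sims:
            p = [weights[k] * beta_pdf(x, *params[k]) for k in range(2)]
            s = sum(p)
            resp.append([pi / s for pi in p])
        # M-step: weighted moments -> Beta parameters per component.
        new_params, new_weights = [], []
        for k in range(2):
            rk = [r[k] for r in resp]
            nk = sum(rk)
            mean = sum(r * x for r, x in zip(rk, sims)) / nk
            var = sum(r * (x - mean) ** 2 for r, x in zip(rk, sims)) / nk
            common = mean * (1 - mean) / max(var, 1e-8) - 1
            new_params.append((max(mean * common, 1e-3),
                               max((1 - mean) * common, 1e-3)))
            new_weights.append(nk / len(sims))
        params, weights = new_params, new_weights
    return params, weights

def intra_class_posterior(x, params, weights):
    # Treat the component with the larger mean as the intra-class one.
    means = [a / (a + b) for a, b in params]
    k = means.index(max(means))
    p = [weights[j] * beta_pdf(x, *params[j]) for j in range(2)]
    return p[k] / sum(p)

if __name__ == "__main__":
    random.seed(0)
    # Synthetic inter-class (low) and intra-class (high) similarities.
    sims = ([random.betavariate(2, 8) for _ in range(300)]
            + [random.betavariate(8, 2) for _ in range(300)])
    params, weights = fit_beta_mixture(sims)
    print(intra_class_posterior(0.9, params, weights))
    print(intra_class_posterior(0.1, params, weights))
```

In the paper's setting, such a posterior would then serve as a soft edge weight when forming the probability complementation graph, in place of a hard KNN cutoff.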

Keywords: Beta mixture model; Expectation maximization algorithm; Graph complementation; Graph contrastive learning; Heterophily.

MeSH terms

  • Algorithms*
  • Humans
  • Machine Learning
  • Neural Networks, Computer*
  • Probability*