
Graph Contrastive Learning (GCL) has emerged as a solution for graph self-supervised learning. Existing GCL methods typically adopt the binary-contrastive setting: making binary decisions (positive/negative pairs) over the generated views, pulling positive (similar) pairs close and pushing negative (dissimilar) pairs far apart. Despite promising performance, two critical issues arise: (i) the validity of view construction cannot be guaranteed: graph perturbation may produce invalid views that violate the semantics and intrinsic topology of graph data; (ii) the binary-contrastive setting is unreliable: for non-Euclidean graph data, positive (similar) and negative (dissimilar) pairs are difficult to determine. These two problems raise a research question: Is the binary-contrastive setting really necessary for graph contrastive learning? In this paper, we investigate this question and introduce a novel GCL paradigm, namely Graph Soft-Contrastive Learning (GSCL), which conducts graph self-supervised learning via neighborhood ranking without relying on the binary-contrastive setting. Specifically, GSCL is built upon the basic assumption of graph homophily: connected neighbors are more similar than far-distant nodes. Under the GSCL paradigm, we develop pair-wise and list-wise Gated Ranking InfoNCE loss functions to preserve the relative ranking relationships within the neighborhood. Moreover, as the neighborhood size expands exponentially with the number of hops considered, we propose neighborhood sampling strategies to improve learning efficiency. Extensive experimental results show that GSCL achieves competitive or even superior performance compared with current state-of-the-art GCL methods. We expect our work to stimulate further research that moves beyond the traditional binary-contrastive setting and conforms to the inherent characteristics and properties of graphs.
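The abstract does not give the exact form of the Gated Ranking InfoNCE loss, but the homophily-based ranking idea can be illustrated with a minimal sketch. Assuming cosine similarity as the score, a pair-wise term compares a nearer (lower-hop) neighbor against a farther (higher-hop) node for the same anchor, and a gate skips pairs whose ranking is already satisfied so that only violations contribute. All function and variable names here are hypothetical, not the paper's implementation.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity with a small epsilon for numerical safety.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def pairwise_gated_ranking_infonce(z, anchor, nearer, farther, tau=0.5):
    """Illustrative sketch (not the paper's exact loss): for an anchor node,
    every `nearer` (lower-hop) neighbor should score higher than every
    `farther` (higher-hop) node. z is an (n_nodes, dim) embedding matrix."""
    loss, n_terms = 0.0, 0
    for j in nearer:
        s_near = cosine(z[anchor], z[j])
        for k in farther:
            s_far = cosine(z[anchor], z[k])
            if s_near > s_far:
                continue  # gate: ranking already holds, no gradient needed
            # InfoNCE-style pairwise term: the nearer neighbor acts as the
            # "positive", the farther node as the sole "negative".
            loss += -np.log(np.exp(s_near / tau) /
                            (np.exp(s_near / tau) + np.exp(s_far / tau)))
            n_terms += 1
    return loss / max(n_terms, 1)
```

A list-wise variant would instead score a whole ranked list of multi-hop neighborhoods at once; the gating principle (penalize only violated orderings) carries over directly.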