Graph contrastive learning (GCL) has emerged as a powerful self-supervised paradigm for graph representation learning. A critical challenge in GCL arises from the conflict between the contrastive loss and the message-passing mechanism of graph convolutional network encoders: message passing pulls neighboring nodes together, while the contrastive loss pushes negative pairs, many of which are neighbors, apart. Prevalent methods typically tackle this conflict by identifying conflicting node pairs and ignoring them during training. However, the identification of conflicting pairs in these methods can be one-sided or even incorrect, discarding valuable pairs that are inherently non-conflicting. To overcome this limitation, we propose conflict-refined graph contrastive learning (CR-GCL), a novel framework that leverages pseudo-labels to avoid discarding non-conflicting negative pairs. Technically, CR-GCL comprises three key components: 1) a perturbation augmentation module that creates augmented graphs and generates node representations; 2) a conflict quantification module that estimates the gradient impact of negative pairs by tracking changes in representation similarity; and 3) a conflict refinement module that uses pseudo-labels to sharpen the identification of conflicting negative pairs, which are then ignored during training. Extensive experiments on six benchmark datasets show that CR-GCL significantly outperforms state-of-the-art methods, delivering superior node classification accuracy across various label rates.
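The conflict described above can be illustrated with a minimal sketch. The toy graph, node features, and helper names below are illustrative assumptions, not taken from the CR-GCL paper: one step of mean-aggregation message passing drives linked neighbors toward the same representation, even though an InfoNCE-style contrastive loss would treat that same pair as negatives and push them apart.

```python
import numpy as np

# Toy setting (illustrative, not from the paper): 4 nodes with 8-dim
# random features; nodes 0 and 1 are linked neighbors, as are 2 and 3.
rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 8))
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def message_pass(Z, A):
    """One mean-aggregation step: pulls each node toward its neighbors."""
    A_hat = A + np.eye(len(A))           # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(1))  # row-normalize
    return D_inv @ A_hat @ Z

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

Z_out = message_pass(Z, A)
before = cos(Z[0], Z[1])
after = cos(Z_out[0], Z_out[1])

# Message passing raises the similarity of the linked pair (0, 1),
# while an InfoNCE-style contrastive loss would treat (0, 1) as a
# negative pair and push them apart -- the conflict CR-GCL targets.
print(f"neighbor similarity before: {before:.3f}, after: {after:.3f}")
assert after > before
```

Because nodes 0 and 1 have only each other as neighbors, a single aggregation step makes their representations nearly identical, making the tension with a loss that repels them as negatives explicit.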
Citation: Geng Tang, Qiguo Sun, Fengjun Zhang, Keyu Liu, Xibei Yang. Refining conflict in graph contrastive learning via pseudo-labels[J]. Electronic Research Archive, 2025, 33(8): 5133-5157. doi: 10.3934/era.2025230