Graph contrastive learning methods have recently emerged as a promising solution to label scarcity in real-world scenarios. However, most existing methods remain limited by the lack of guided objectives, and they fail to effectively exploit complementary structural information from different graphs. To address these limitations, we propose a novel semi-supervised teaching graph contrastive network (STGCN) for node classification. Built on a teaching network architecture, STGCN establishes multi-level contrastive objectives that provide rich, fine-grained supervision for the graph encoders. Specifically, after analyzing the intrinsic correlation between different augmented views, we feed a diffusion graph and two augmented views into the teaching network, in which one teacher encoder guides two weight-sharing student encoders. Furthermore, we introduce a random sampling mixing module that extracts complementary information from multiple graphs, along with a label propagation technique to fully exploit the limited labeled data. Finally, our method incorporates a supervised contrastive loss and node similarity regularization to ensure coherent alignment between labeled and unlabeled nodes. Extensive experiments on five real-world node classification datasets demonstrate accuracy improvements of up to 2.60% over competing models.
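As a minimal illustration of the kind of supervised contrastive objective the abstract mentions (not the paper's exact STGCN formulation), the loss pulls together embeddings of nodes that share a label and pushes apart all others. The function name `sup_con_loss`, the temperature `tau`, and the toy inputs below are assumptions for the sketch:

```python
import numpy as np

def sup_con_loss(z, labels, tau=0.5):
    """Supervised contrastive loss over node embeddings.

    z      : (n, d) array of node embeddings (L2-normalized internally).
    labels : (n,) integer class labels; same-label pairs are positives.
    tau    : temperature scaling the cosine similarities.
    """
    # Normalize so the dot product is cosine similarity.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau
    n = len(labels)

    # A node is never its own positive; exclude the diagonal.
    self_mask = np.eye(n, dtype=bool)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask

    # Softmax denominator over all other nodes (exp(-inf) = 0 drops self).
    sim = np.where(self_mask, -np.inf, sim)
    log_denom = np.log(np.exp(sim).sum(axis=1))

    # Average negative log-probability of the positives, per anchor.
    per_anchor = [
        -(sim[i, pos_mask[i]] - log_denom[i]).mean()
        for i in range(n)
        if pos_mask[i].any()
    ]
    return float(np.mean(per_anchor))
```

On a toy example where the two classes are already well separated, the loss is small and finite; as same-class embeddings drift apart, it grows. This is one common form of the objective; the paper combines it with multi-level contrastive terms and a node similarity regularizer not shown here.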
Citation: Chenbin Shen, Jingjing Song, Qiguo Sun, Qihang Guo, Eric C.C. Tsang. Semi-supervised teaching graph contrastive network for node classification[J]. Electronic Research Archive, 2026, 34(3): 1626-1652. doi: 10.3934/era.2026074