Graph neural networks (GNNs) have been widely studied for handling graph-structured data. Neighborhood awareness, a mechanism for integrating contextual information into node representations, plays a crucial role in learning node embeddings with GNNs. Existing methods typically perform neighborhood-aware steps only at the node or hop level, which limits their ability to learn edge-level semantic information. Extending neighborhood awareness to the edge level raises two main challenges: (1) designing a learning framework with message propagation over edge embeddings, which must construct edge embeddings that capture pairwise node relationships and propagate messages to perceive the local context, and (2) developing a fusion approach that bridges the node and edge embedding spaces, thereby compressing edge-level structural information into the node space to enhance the original neighborhood awareness. In this study, we propose an edge-level neighborhood awareness network (ELNA-Net) that integrates both node- and edge-level context for more comprehensive neighborhood awareness. Specifically, we use the Taylor interaction effect to construct explicit interactions between pairwise nodes, which approximates edge embeddings and captures intricate non-linear correlations. Furthermore, an adaptive edge-aware propagation module adopts a line graph to switch the roles of nodes and edges, revealing potential dependencies among neighboring edges and promoting neighborhood awareness. Finally, we propose a cross-mapping fusion approach that integrates node- and edge-level contexts within the graph convolution operator. Experimental results demonstrate that ELNA-Net achieves superior performance in semi-supervised node classification and link prediction, outperforming state-of-the-art models.
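The line-graph transformation underlying the edge-aware propagation module can be illustrated with a small self-contained sketch (plain Python, no GNN machinery; the toy graph and its edge labels are invented for illustration): each edge of the original graph becomes a node of the line graph, and two such nodes are joined whenever the original edges share an endpoint, which is what allows edge embeddings to exchange messages with neighboring edges.

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph of an undirected graph given as an edge list.

    Each original edge becomes a line-graph node (stored as a sorted
    tuple of its endpoints); two line-graph nodes are adjacent iff the
    corresponding original edges share an endpoint.
    """
    nodes = [tuple(sorted(e)) for e in edges]
    lg_edges = [
        (e1, e2)
        for e1, e2 in combinations(nodes, 2)
        if set(e1) & set(e2)  # adjacent iff they share an endpoint
    ]
    return nodes, lg_edges

# Toy graph: a triangle a-b-c with a pendant edge c-d.
G_edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
lg_nodes, lg_adj = line_graph(G_edges)

print(len(lg_nodes), len(lg_adj))  # 4 line-graph nodes, 5 line-graph edges
```

In practice this transformation is available off the shelf, e.g. `networkx.line_graph` or the `LineGraph` transform in PyTorch Geometric; the sketch above only shows the role switch itself, not the propagation that ELNA-Net performs on top of it.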
Citation: Yutong Guo, Wenrui Guan, Qihang Guo, Yuge Wang, Keyu Liu, Xibei Yang. Extending graph neural networks to edge-level neighborhood awareness via line graph[J]. Electronic Research Archive, 2026, 34(2): 866-889. doi: 10.3934/era.2026040