The increasing complexity of modern networks, from communication infrastructures to power grids and social networks, demands models that capture both structural dependencies and nonlinear long-memory dynamics. We proposed a hybrid framework that unified deep learning (graph neural networks, recurrent/attention modules) with fractional calculus to model nonlocal memory, anomalous diffusion, and self-similarity. Fractional differential formulations provided a principled description of network evolution, for which we stated a checkable stability condition; the learning pipeline coupled gradient-based training with fractional operators for robust, interpretable prediction. On the Los Angeles metropolitan area traffic (METR-LA) dataset, the proposed ensemble integrated deep fractional model (EIDFM) achieved a mean absolute error (MAE) of about 6.4 and a root mean square error (RMSE) of 10.8, improving over the strongest baseline hybrid (CNN-LSTM: MAE $ 7.8 $, RMSE $ 12.5 $) by 18% and 14%, respectively; mean absolute percentage error (MAPE) dropped from $ 6.3\% $ to $ 5.2\% $ ($ \approx17\% $ relative), while $ R^2 $ rose from $ 0.91 $ to $ 0.94 $. Results were reported as mean$ \pm $std over five seeds with paired significance tests. A lightweight efficiency analysis showed modest overhead relative to the baseline (inference $ 3.2\pm0.3 $ ms/step vs. $ 2.8\pm0.2 $; parameters $ 12.8 $M vs. $ 11.3 $M), justified by the accuracy gains. These findings indicated that integrating fractional operators with graph-based deep learners yields a mathematically grounded and practically effective paradigm for understanding and managing complex network dynamics.
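The quoted relative gains follow directly from the reported metrics. A quick arithmetic check (the helper name is ours, for illustration only):

```python
# Sanity-check the relative improvements quoted in the abstract
# (EIDFM vs. the CNN-LSTM baseline on METR-LA).
def rel_improvement(baseline, proposed):
    """Percentage reduction of an error metric relative to the baseline."""
    return (baseline - proposed) / baseline * 100.0

mae_gain = rel_improvement(7.8, 6.4)     # MAE:  ~18% reduction
rmse_gain = rel_improvement(12.5, 10.8)  # RMSE: ~14% reduction
mape_gain = rel_improvement(6.3, 5.2)    # MAPE: ~17% relative drop
```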
Citation: Muhammad Ayaz, Tariq Ali, Mohammad Hijji, Imran Baig, Tareq Alhmiedat, El-Hadi M. Aggoune. Predictive modeling of complex networks using deep learning and fractional dynamics[J]. AIMS Mathematics, 2025, 10(11): 26717-26743. doi: 10.3934/math.20251175
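The nonlocal memory attributed to fractional operators can be made concrete with a Grünwald–Letnikov discretization, one standard way to approximate a fractional derivative on sampled data (the paper's exact operator and the function names below are not specified in the abstract; this is a minimal illustrative sketch):

```python
def gl_weights(alpha, n):
    """Grünwald-Letnikov coefficients w_k = (-1)^k * C(alpha, k),
    computed by the standard recursion w_k = w_{k-1} * (1 - (alpha+1)/k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def gl_fractional_diff(x, alpha, h=1.0):
    """Approximate the order-alpha fractional derivative of the series x
    with step h. Each output value is a weighted sum over the ENTIRE past,
    which is the nonlocal-memory property fractional operators bring to
    time-series modeling; for alpha = 1 it reduces to the first difference."""
    n = len(x)
    w = gl_weights(alpha, n)
    scale = h ** (-alpha)
    return [scale * sum(w[k] * x[i - k] for k in range(i + 1)) for i in range(n)]
```

For $\alpha = 1$ the weights truncate to $(1, -1, 0, 0, \dots)$ and the operator is the ordinary backward difference; for non-integer $\alpha$ all past samples receive nonzero (slowly decaying) weights, which is the long-memory behavior the framework exploits.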