Research article

Dual-Uncertainty modeling in financial time-series via VMD-LSTM with concrete dropout and VMD-WGAN

  • Published: 16 December 2025
  • Abstract: Financial time-series forecasting requires not only accurate point predictions but also a principled characterization of the uncertainty around future outcomes. We proposed a unified dual-path framework that modeled both epistemic and aleatoric uncertainty using a shared variational mode decomposition (VMD) backbone. VMD stabilized nonstationary signals, enabling the predictive path, a long short-term memory (LSTM) network integrated with concrete dropout, to quantify epistemic uncertainty, while the generative path used a conditional Wasserstein generative adversarial network (WGAN) to capture aleatoric risk. Empirical evaluations on the Standard & Poor's 500 (S&P 500) index and Financial Times Stock Exchange 100 (FTSE 100) index demonstrated superior predictive accuracy and distributional fidelity over strong baselines. Comprehensive ablation studies and regime-conditioned analyses revealed a clear frequency-wise division of labor: low-frequency modes drove predictive accuracy, while the generative path successfully reproduced heavy-tailed, regime-dependent return distributions. These findings underscored the efficacy of decomposed uncertainty modeling for robust financial risk assessment.

    Citation: Jeonggyu Huh, Dajin Kim, Minseok Jung, Seungwon Jeong. Dual-Uncertainty modeling in financial time-series via VMD-LSTM with concrete dropout and VMD-WGAN[J]. Networks and Heterogeneous Media, 2025, 20(5): 1411-1436. doi: 10.3934/nhm.2025061
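    The epistemic path described in the abstract can be illustrated with a minimal Monte Carlo dropout sketch: dropout is kept active at inference, and the spread across stochastic forward passes serves as an epistemic-uncertainty proxy. This is a toy stand-in, not the paper's model: a fixed dropout rate replaces the learned concrete-dropout rate, a one-layer network replaces the VMD-LSTM, and the weights, input size, and 200-sample count are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy single-layer network standing in for the LSTM predictor (hypothetical weights).
    W1 = rng.normal(size=(8, 16)) * 0.5
    b1 = np.zeros(16)
    W2 = rng.normal(size=(16, 1)) * 0.5
    b2 = np.zeros(1)

    def forward(x, p_drop=0.2):
        """One stochastic forward pass with dropout kept active at inference."""
        h = np.tanh(x @ W1 + b1)
        mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
        h = h * mask / (1.0 - p_drop)         # inverted-dropout rescaling
        return (h @ W2 + b2).ravel()

    x = rng.normal(size=(1, 8))               # one (toy) input window

    # Repeat the stochastic pass; mean = point prediction,
    # sample spread = epistemic-uncertainty estimate.
    samples = np.stack([forward(x) for _ in range(200)])
    mean = samples.mean(axis=0)
    epistemic_std = samples.std(axis=0)
    ```

    Concrete dropout differs from this sketch in that `p_drop` becomes a trainable parameter optimized through a continuous relaxation of the Bernoulli mask, so the network learns how much epistemic uncertainty to report per layer.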



  • © 2025 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
