Forecasting Chaotic Multivariate Time Series Based on DB-ICDNet

Authors

  • RuiXia Xie, College of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070, PR China
  • LiLi Wang, College of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070, PR China

DOI:

https://doi.org/10.63313/JCSFT.9050

Keywords:

Chaotic Time Series, Multi-Step Prediction, Attention Mechanism, Two-Branch Coding, Cooperative Decoding

Abstract

Chaotic time series arise widely in complex scenarios such as nonlinear dynamical systems, financial markets, and meteorological observations. Their strong nonlinearity, sensitivity to initial values, and accumulation of long-term prediction errors make stable, high-precision multi-step forecasting under limited observation conditions extremely challenging. Traditional linear models struggle to approximate nonlinear dynamics, while conventional deep learning models are prone to phase drift and error amplification during long-horizon extrapolation. This study therefore focuses on multivariate multi-step forecasting and proposes DB-ICDNet, a lightweight deep forecasting framework. In the representation learning stage, the framework extracts long-range dependencies and local perturbations through an LSTM branch and a causally dilated TCN branch, and fuses the two in a controlled manner over the near-term window via a residual gated fusion mechanism. In the prediction stage, it employs a step-conditioned dual-memory cooperative decoder that adaptively retrieves long-term and short-term contexts for different prediction horizons, and generates future trajectories step by step through an incremental cumulative regression head anchored at the final observation, with consistency constraints to suppress error diffusion. Experiments on the Lorenz and Chen benchmark systems and the HS300 real-world dataset show that the method achieves higher accuracy and stability in multi-step prediction; in particular, under long-horizon extrapolation it effectively preserves the trajectory shape and alleviates error accumulation, verifying its effectiveness for chaotic sequence prediction.
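The benchmark setup and the anchored decoding idea in the abstract can be sketched in plain NumPy: integrating the Lorenz system to produce a chaotic multivariate series, slicing it into lookback/horizon windows for multi-step forecasting, and reconstructing a predicted trajectory from the last observation plus cumulative increments. The Lorenz equations and the classic parameters (sigma = 10, rho = 28, beta = 8/3) are standard; the window sizes and the helper names (`rk4_trajectory`, `make_windows`, `incremental_decode`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def lorenz_deriv(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system (classic chaotic parameters)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_trajectory(x0, dt=0.01, n_steps=5000):
    """Integrate the Lorenz system with fixed-step RK4; returns (n_steps, 3)."""
    traj = np.empty((n_steps, 3))
    s = np.asarray(x0, dtype=float)
    for i in range(n_steps):
        k1 = lorenz_deriv(s)
        k2 = lorenz_deriv(s + 0.5 * dt * k1)
        k3 = lorenz_deriv(s + 0.5 * dt * k2)
        k4 = lorenz_deriv(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = s
    return traj

def make_windows(series, lookback=32, horizon=8):
    """Slice a (T, d) series into (X, Y) pairs for multi-step forecasting."""
    X, Y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t : t + lookback])
        Y.append(series[t + lookback : t + lookback + horizon])
    return np.array(X), np.array(Y)

def incremental_decode(anchor, deltas):
    """Anchored cumulative decoding: y_hat[k] = anchor + sum of deltas[:k+1].

    `anchor` is the last observed state (d,); `deltas` are per-step
    increments (horizon, d), e.g. produced by a regression head."""
    return anchor + np.cumsum(deltas, axis=0)

if __name__ == "__main__":
    traj = rk4_trajectory([1.0, 1.0, 1.0], dt=0.01, n_steps=5000)
    X, Y = make_windows(traj, lookback=32, horizon=8)
    print(X.shape, Y.shape)  # windows over the chaotic trajectory
```

Predicting per-step increments and accumulating them from the final observation, rather than predicting absolute values, keeps each forecast step anchored to the observed trajectory, which is one common way to limit drift in long-horizon extrapolation.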


Published

2026-03-10

Section

Articles

How to Cite

Forecasting Chaotic Multivariate Time Series Based on DB-ICDNet. (2026). Journal of Computer Science and Frontier Technologies, 2(3), 86-100. https://doi.org/10.63313/JCSFT.9050