Citation: H. Wang, M. Chen, K. Yu, J. Liang, Z. Lv, and Z. Zhang, “KT-RC: Kernel time-delayed reservoir computing for time series prediction,” IEEE/CAA J. Autom. Sinica, 2025. doi: 10.1109/JAS.2024.124986
[1] T. D. Pham, K. Wardell, A. Eklund, and G. Salerud, “Classification of short time series in early Parkinson’s disease with deep learning of fuzzy recurrence plots,” IEEE/CAA J. Autom. Sinica, vol. 6, no. 6, pp. 1306–1317, Nov. 2019. doi: 10.1109/JAS.2019.1911774
[2] L. Chen, L. Wang, Z. Han, J. Zhao, and W. Wang, “Variational inference based kernel dynamic Bayesian networks for construction of prediction intervals for industrial time series with incomplete input,” IEEE/CAA J. Autom. Sinica, vol. 7, no. 5, pp. 1437–1445, Sep. 2019.
[3] M. Wang, H. Shi, and C. Wang, “Distributed cooperative learning for discrete-time strict-feedback multi-agent systems over directed graphs,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 10, pp. 1831–1844, Oct. 2022. doi: 10.1109/JAS.2022.105542
[4] R. Lewis and G. C. Reinsel, “Prediction of multivariate time series by autoregressive model fitting,” J. Multivar. Anal., vol. 16, no. 3, pp. 393–411, Jun. 1985. doi: 10.1016/0047-259X(85)90027-2
[5] H. Yang, Z. Pan, Q. Tao, and J. Qiu, “Online learning for vector autoregressive moving-average time series prediction,” Neurocomputing, vol. 315, pp. 9–17, Sep. 2018. doi: 10.1016/j.neucom.2018.04.011
[6] C. N. Babu and B. E. Reddy, “A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data,” Appl. Soft Comput., vol. 23, pp. 27–38, Jun. 2014.
[7] H. Jaeger and H. Haas, “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication,” Science, vol. 304, no. 5667, pp. 78–80, Apr. 2004. doi: 10.1126/science.1091277
[8] J. Liu, S. Li, and R. Liu, “Recurrent neural network inspired finite-time control design,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 6, pp. 1527–1529, Jun. 2024. doi: 10.1109/JAS.2023.123297
[9] A. H. Ribeiro, K. Tiels, L. A. Aguirre, and T. Schön, “Beyond exploding and vanishing gradients: Analyzing RNN training using attractors and smoothness,” in Proc. Int. Conf. Artificial Intelligence and Statistics, pp. 2370–2380, 2020.
[10] Q. Zhang, H. Qian, Y. Chen, and D. Lei, “A short-term traffic forecasting model based on echo state network optimized by improved fruit fly optimization algorithm,” Neurocomputing, vol. 416, pp. 117–124, Oct. 2020. doi: 10.1016/j.neucom.2019.02.062
[11] R. Gao, L. Du, O. Duru, and K. F. Yuen, “Time series forecasting based on echo state network and empirical wavelet transformation,” Appl. Soft Comput., vol. 102, p. 107111, Oct. 2021.
[12] T. Kim and B. R. King, “Time series prediction using deep echo state networks,” Neural Comput. Appl., vol. 32, no. 23, pp. 17769–17787, Apr. 2020. doi: 10.1007/s00521-020-04948-x
[13] X. Chen, M. Liu, and S. Li, “Echo state network with probabilistic regularization for time series prediction,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 8, pp. 1743–1753, Aug. 2023. doi: 10.1109/JAS.2023.123489
[14] S. Basterrech and G. Rubino, “Evolutionary echo state network: A neuroevolutionary framework for time series prediction,” Appl. Soft Comput., vol. 144, p. 110463, Jun. 2023. doi: 10.1016/j.asoc.2023.110463
[15] T. Carvalho, P. Paiva, M. Vellasco, J. F. Amaral, and P. H. Coelho, “A neuroevolutionary approach for system identification,” J. Control Autom. Electr. Syst., vol. 35, no. 1, pp. 64–73, Dec. 2024. doi: 10.1007/s40313-023-01061-x
[16] X. Wang, Y. Jin, and K. Hao, “Evolving local plasticity rules for synergistic learning in echo state networks,” IEEE Trans. Neural Netw. Learn. Syst., vol. 31, no. 4, pp. 1363–1374, Apr. 2020. doi: 10.1109/TNNLS.2019.2919903
[17] Z. Pu, C. Li, S. Zhang, and Y. Bai, “Fault diagnosis for wind turbine gearboxes by using deep enhanced fusion network,” IEEE Trans. Instrum. Meas., vol. 70, pp. 1–11, Sep. 2020.
[18] X. Li, F. Bi, L. Zhang, J. Lin, X. Bi, and X. Yang, “Rotating machinery faults detection method based on deep echo state network,” Appl. Soft Comput., vol. 127, p. 109335, Jul. 2022. doi: 10.1016/j.asoc.2022.109335
[19] L. A. Thiede and U. Parlitz, “Gradient based hyperparameter optimization in echo state networks,” Neural Netw., vol. 115, pp. 23–29, Mar. 2019. doi: 10.1016/j.neunet.2019.02.001
[20] D. Li, M. Han, and J. Wang, “Chaotic time series prediction based on a novel robust echo state network,” IEEE Trans. Neural Netw. Learn. Syst., vol. 23, no. 5, pp. 787–799, Apr. 2012. doi: 10.1109/TNNLS.2012.2188414
[21] R. Hu, Z. R. Tang, X. Song, J. Luo, E. Q. Wu, and S. Chang, “Ensemble echo network with deep architecture for time-series modeling,” Neural Comput. Appl., vol. 33, pp. 4997–5010, Aug. 2021.
[22] Z. Wang, Y. R. Zeng, S. Wang, and L. Wang, “Optimizing echo state network with backtracking search optimization algorithm for time series forecasting,” Eng. Appl. Artif. Intell., vol. 81, pp. 117–132, Feb. 2019. doi: 10.1016/j.engappai.2019.02.009
[23] S. Zhong, X. Xie, L. Lin, and F. Wang, “Genetic algorithm optimized double-reservoir echo state network for multi-regime time series prediction,” Neurocomputing, vol. 238, pp. 191–204, Mar. 2017. doi: 10.1016/j.neucom.2017.01.053
[24] H. Wang and X. Yan, “Optimizing the echo state network with a binary particle swarm optimization algorithm,” Knowledge-Based Syst., vol. 86, pp. 182–193, Jul. 2015. doi: 10.1016/j.knosys.2015.06.003
[25] G. T. Ribeiro, J. G. Sauer, N. Fraccanabbia, V. C. Mariani, and L. dos Santos Coelho, “Bayesian optimized echo state network applied to short-term load forecasting,” Energies, vol. 13, no. 9, p. 2390, May 2020.
[26] G. Tang, L. Yu, W. Lv, and Y. Sun, “Dual-kernel echo state network for nonlinear time series prediction,” Proc. Romanian Acad. Ser. A, vol. 24, no. 2, pp. 179–190, Jun. 2023.
[27] X. Chen, L. Jin, and S. Li, “An online learning strategy for echo state network,” IEEE Trans. Syst. Man Cybern. Syst., vol. 54, no. 1, pp. 644–655, Oct. 2023.
[28] H. Wang, Y. Liu, D. Wang, Y. Luo, C. Tong, and Z. Lv, “Discriminative and regularized echo state network for time series classification,” Pattern Recognit., vol. 130, p. 108811, May 2022.
[29] H. Wang, Y. Liu, P. Lu, Y. Luo, D. Wang, and X. Xu, “Echo state network with logistic mapping and bias dropout for time series prediction,” Neurocomputing, vol. 489, pp. 196–210, Mar. 2022. doi: 10.1016/j.neucom.2022.03.018
[30] M. Han and M. Xu, “Laplacian echo state network for multivariate time series prediction,” IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 1, pp. 238–244, Dec. 2017.
[31] H. Wang, Y. Liu, D. Wang, Y. Luo, and J. Xin, “Multi-layer echo state network with nonlinear vector autoregression reservoir for time series prediction,” in Proc. Int. Conf. Neural Computing for Advanced Applications, pp. 489–502, 2022.
[32] N. Li, G. Lu, X. Li, and Y. Yan, “Prediction of pollutant emissions of biomass flames through digital imaging, contourlet transform, and support vector regression modeling,” IEEE Trans. Instrum. Meas., vol. 64, no. 9, pp. 2409–2416, Sep. 2015. doi: 10.1109/TIM.2015.2411999
[33] A. Zeng, H. Ho, and Y. Yu, “Prediction of building electricity usage using Gaussian process regression,” J. Build. Eng., vol. 28, p. 101054, Nov. 2020.
[34] C. A. Sims, “Macroeconomics and reality,” Econometrica, vol. 48, no. 1, pp. 1–48, Jul. 1980. doi: 10.2307/1912017
[35] E. Bollt, “On explaining the surprising success of reservoir computing forecaster of chaos? The universal machine learning dynamical system with contrast to VAR and DMD,” Chaos, vol. 31, no. 1, Jan. 2021.
[36] F. Takens, “Detecting strange attractors in turbulence,” in Proc. Dynamical Systems and Turbulence, pp. 366–381, 1981.
[37] S. Boyd and L. Chua, “Fading memory and the problem of approximating nonlinear operators with Volterra series,” IEEE Trans. Circuits Syst., vol. 32, no. 11, pp. 1150–1161, Nov. 1985. doi: 10.1109/TCS.1985.1085649
[38] H. O. A. Wold, “On prediction in stationary time series,” Ann. Math. Statist., vol. 19, no. 4, pp. 558–567, Dec. 1948. doi: 10.1214/aoms/1177730151
[39] H. Xiong, M. Swamy, and M. O. Ahmad, “Optimizing the kernel in the empirical feature space,” IEEE Trans. Neural Netw., vol. 16, no. 2, pp. 460–474, Mar. 2005.
[40] B. Schölkopf, S. Mika, C. J. Burges, P. Knirsch, K. R. Müller, G. Rätsch, and A. J. Smola, “Input space versus feature space in kernel-based methods,” IEEE Trans. Neural Netw., vol. 10, no. 5, pp. 1000–1017, Sep. 1999.
[41] P. R. Vlachas, J. Pathak, B. R. Hunt, T. P. Sapsis, M. Girvan, E. Ott, and P. Koumoutsakos, “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics,” Neural Netw., vol. 126, pp. 191–217, Apr. 2020. doi: 10.1016/j.neunet.2020.02.016
[42] D. Stolfi, “Parking Birmingham,” UCI Machine Learning Repository, 2019, https://doi.org/10.24432/C51K5Z.
[43] P. Li, F. Zhang, L. Gao, Y. Liu, and X. Ren, “A novel model for chaotic complex time series with large of data forecasting,” Knowledge-Based Syst., vol. 222, p. 107009, Apr. 2021. doi: 10.1016/j.knosys.2021.107009
[44] N. Parvini, M. Abdollahi, S. Seifollahi, and D. Ahmadian, “Forecasting bitcoin returns with long short-term memory networks and wavelet decomposition: A comparison of several market determinants,” Appl. Soft Comput., vol. 121, p. 108707, Apr. 2022. doi: 10.1016/j.asoc.2022.108707
[45] J. H. Curry, “A generalized Lorenz system,” Commun. Math. Phys., vol. 60, pp. 193–204, Oct. 1978. doi: 10.1007/BF01612888
[46] A. Salam and A. E. Hibaoui, “Comparison of machine learning algorithms for the power consumption prediction: Case study of Tetouan city,” in Proc. 6th Int. Renewable and Sustainable Energy Conf., pp. 1–5, 2018.
[47] H. Ma, T. Zhou, K. Aihara, and L. Chen, “Predicting time series from short-term high-dimensional data,” Int. J. Bifurcat. Chaos, vol. 24, no. 12, p. 1430033, Jul. 2014. doi: 10.1142/S021812741430033X
[48] C. Chen, R. Li, L. Shu, Z. He, J. Wang, C. Zhang, H. Ma, K. Aihara, and L. Chen, “Predicting future dynamics from short-term time series using an anticipated learning machine,” Natl. Sci. Rev., vol. 7, no. 6, pp. 1079–1091, Feb. 2020. doi: 10.1093/nsr/nwaa025
[49] Y. Gao, R. Wang, and E. Zhou, “Stock prediction based on optimized LSTM and GRU models,” Sci. Program., vol. 2021, pp. 1–8, Sep. 2021.
[50] B. Majhi, D. Naidu, A. P. Mishra, and S. C. Satapathy, “Improved prediction of daily pan evaporation using deep-LSTM model,” Neural Comput. Appl., vol. 32, pp. 7823–7838, Jun. 2020. doi: 10.1007/s00521-019-04127-7
[51] J. B. Butcher, D. Verstraeten, B. Schrauwen, C. R. Day, and P. W. Haycock, “Reservoir computing and extreme learning machines for nonlinear time-series data analysis,” Neural Netw., vol. 38, pp. 76–89, Feb. 2013. doi: 10.1016/j.neunet.2012.11.011
[52] C. Gallicchio and A. Micheli, “Architectural and Markovian factors of echo state networks,” Neural Netw., vol. 24, no. 5, pp. 440–456, Jun. 2011. doi: 10.1016/j.neunet.2011.02.002
[53] X. Sun, T. Li, Q. Li, Y. Huang, and Y. Li, “Deep belief echo-state network and its application to time series prediction,” Knowledge-Based Syst., vol. 130, pp. 17–29, Aug. 2017.
[54] C. Gallicchio, A. Micheli, and L. Pedrelli, “Deep reservoir computing: A critical experimental analysis,” Neurocomputing, vol. 268, pp. 87–99, Dec. 2017. doi: 10.1016/j.neucom.2016.12.089
[55] H. Wang, Y. Zhang, J. Liang, and L. Liu, “DAFA-BiLSTM: Deep autoregression feature augmented bidirectional LSTM network for time series prediction,” Neural Netw., vol. 157, pp. 240–256, Jan. 2023. doi: 10.1016/j.neunet.2022.10.009
[56] A. Kraskov, H. Stögbauer, and P. Grassberger, “Estimating mutual information,” Phys. Rev. E, vol. 69, no. 6, p. 066138, Jun. 2004. doi: 10.1103/PhysRevE.69.066138
[57] P. M. Addo, M. Billio, and D. Guegan, “Nonlinear dynamics and recurrence plots for detecting financial crisis,” N. Am. J. Econ. Financ., vol. 26, pp. 416–435, Dec. 2013.