A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical and experimental research and development in all areas of automation

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: X. Chen, Z. Su, L. Jin, and S. Li, “A correntropy-based echo state network with application to time series prediction,” IEEE/CAA J. Autom. Sinica, vol. 12, no. 2, pp. 1–11, Feb. 2025. doi: 10.1109/JAS.2024.124932

A Correntropy-Based Echo State Network With Application to Time Series Prediction

doi: 10.1109/JAS.2024.124932
Funds:  This work was supported in part by the National Natural Science Foundation of China (62176109, 62476115), the Fundamental Research Funds for the Central Universities (lzujbky-2023-ey07, lzujbky-2023-it14), the Natural Science Foundation of Gansu Province (24JRRA488), and the Supercomputing Center of Lanzhou University
  • Abstract: As a category of recurrent neural networks, echo state networks (ESNs) have been investigated in depth and applied extensively across a diverse array of fields, with notable success. Nevertheless, the traditional ESN and the majority of its variants are designed around the second-order statistical information of data (e.g., variance and covariance), while other information is neglected. In the context of information-theoretic learning, correntropy is able to capture more information from data. Therefore, under the maximum correntropy criterion, this paper proposes a correntropy-based echo state network (CESN) that captures the first-order and higher-order information of data, promoting robustness to noise. Furthermore, an incremental learning algorithm for the CESN is presented, which updates the CESN when new data arrive without retraining the network from scratch. Finally, experiments on benchmark problems and comparisons with existing works verify the effectiveness and superiority of the proposed CESN.

     

  • Xiufang Chen and Zhenming Su contributed equally to this work.
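To make the idea in the abstract concrete, the following is a minimal Python/NumPy sketch of an echo state network whose readout is fitted under the maximum correntropy criterion by a fixed-point (iteratively reweighted least-squares) iteration. It illustrates the general technique only: the function names, hyperparameters, and toy data are assumptions made here and do not reproduce the paper's exact CESN or its incremental learning algorithm.

# Minimal illustrative sketch (assumption-laden; not the paper's exact CESN):
# an echo state network whose readout weights are trained under the maximum
# correntropy criterion via a fixed-point / iteratively reweighted scheme.
import numpy as np

rng = np.random.default_rng(0)


def build_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    # Random input and recurrent weights; rescale the recurrent matrix so its
    # spectral radius is below 1 (a common echo-state-property heuristic).
    W_in = input_scale * rng.uniform(-1.0, 1.0, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W


def run_reservoir(W_in, W, U, washout=50):
    # Drive the reservoir with inputs U (shape T x n_in) and collect states,
    # discarding the first `washout` transient steps.
    x = np.zeros(W.shape[0])
    states = np.zeros((U.shape[0], W.shape[0]))
    for t in range(U.shape[0]):
        x = np.tanh(W_in @ U[t] + W @ x)
        states[t] = x
    return states[washout:]


def mcc_readout(X, y, sigma=1.0, ridge=1e-4, n_iter=20):
    # Fixed-point iteration for the maximum correntropy criterion: each pass
    # solves a weighted ridge regression in which samples with large errors
    # are exponentially down-weighted by a Gaussian kernel.
    n_feat = X.shape[1]
    w = np.linalg.solve(X.T @ X + ridge * np.eye(n_feat), X.T @ y)  # LS start
    for _ in range(n_iter):
        e = y - X @ w
        lam = np.exp(-e ** 2 / (2.0 * sigma ** 2))    # correntropy weights
        XtL = X.T * lam                               # X^T diag(lam)
        w = np.linalg.solve(XtL @ X + ridge * np.eye(n_feat), XtL @ y)
    return w


# Toy one-step-ahead prediction of a sine wave corrupted by impulsive noise.
washout = 50
T = 600
u = np.sin(0.2 * np.arange(T))[:, None]
y = np.roll(u[:, 0], -1)
y[rng.integers(0, T, 15)] += rng.normal(0.0, 5.0, 15)  # heavy-tailed outliers
W_in, W = build_reservoir(n_in=1, n_res=100)
S = run_reservoir(W_in, W, u, washout=washout)
w_out = mcc_readout(S[:-1], y[washout:-1])
prediction = S[1:] @ w_out

The key step is the Gaussian-kernel weight exp(-e^2/(2*sigma^2)) on each training error: samples with large errors (e.g., outliers or impulsive noise) receive exponentially small weight, which is the source of the robustness that correntropy-based criteria offer over a plain least-squares readout.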
  • [1]
    J. Liu, S. Li, and R. Liu, “Recurrent neural network inspired finite-time control design,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 6, pp. 1527–1529, Jun. 2024. doi: 10.1109/JAS.2023.123297
    [2]
    L. Jin, Z. Su, D. Fu, and X. Xiao, “Coevolutionary neural solution for nonconvex optimization with noise tolerance,” IEEE Trans. Neural Netw. Learn. Syst., DOI 10.1109/TNNLS.2023.3306374.
    [3]
    L. Hua, K. Shi, Z.-G. Wu, S. Han, and S. Zhong, “Sliding mode control for recurrent neural networks with time-varying delays and impulsive effects,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 5, pp. 1319–1321, May 2023. doi: 10.1109/JAS.2023.123372
    [4]
    L. Jin, L. Wei, and S. Li, “Gradient-based differential neural-solution to time-dependent nonlinear optimization,” IEEE Trans. Automat. Contr., vol. 68, no. 1, pp. 620–627, Jan. 2023. doi: 10.1109/TAC.2022.3144135
    [5]
    Z. Zeng, J. Wang, and X. Liao, “Global exponential stability of a general class of recurrent neural networks with time-varying delays,” IEEE Trans. Circuits Syst. I Regul. Pap., vol. 50, no. 10, pp. 1353–1358, Oct. 2003. doi: 10.1109/TCSI.2003.817760
    [6]
    A. Joshi, S. Capezza, A. Alhaji, and M.-Y. Chow, “Survey on AI and machine learning techniques for microgrid energy management systems,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 7, pp. 1513–1529, Jul. 2023. doi: 10.1109/JAS.2023.123657
    [7]
    M. Liu, L. Chen, X. Du, L. Jin, and M. Shang, “Activated gradients for deep neural networks,” IEEE Trans. Neural Netw. Learn. Syst., vol. 34, no. 4, pp. 2156–2168, Apr. 2023. doi: 10.1109/TNNLS.2021.3106044
    [8]
    H. Jaeger, “The ‘echo state’ approach to analysing and training recurrent neural networks-with an erratum note,” Bonn, Germany, German Nat. Res. Center Inf. Technol. GMD Tech. Rep., vol. 148, no. 34, p. 13, 2001.
    [9]
    H. Jaeger and H. Haas, “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication,” Science, vol. 304, no. 5667, pp. 78–80, Apr. 2004. doi: 10.1126/science.1091277
    [10]
    S. Jere, R. Safavinejad, and L. Liu, “Theoretical foundation and design guideline for reservoir computing-based MIMO-OFDM symbol detection,” IEEE Trans. Commun., vol. 71, no. 9, pp. 5169–5181, Sep. 2023. doi: 10.1109/TCOMM.2023.3263874
    [11]
    F. De Vita, G. Nocera, D. Bruneo, and S. K. Das, “A novel echo state network autoencoder for anomaly detection in industrial IoT systems,” IEEE Trans. Ind. Inform., vol. 19, no. 8, pp. 8985–8994, Aug. 2023. doi: 10.1109/TII.2022.3224981
    [12]
    J. Yang, Z. Huang, S. Quan, Z. Cao, and Y. Zhang, “RANSACs for 3D rigid registration: A comparative evaluation,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 10, pp. 1861–1878, Oct. 2022. doi: 10.1109/JAS.2022.105500
    [13]
    D. Su, P. S. Stanimirović, L. B. Han, and L. Jin, “Neural dynamics for improving optimiser in deep learning with noise considered,” CAAI Trans. Intell. Technol., vol. 9, no. 3, pp. 722–737, 2024. doi: 10.1049/cit2.12263
    [14]
    L. Shen, J. Chen, Z. Zeng, J. Yang, and J. Jin, “A novel echo state network for multivariate and nonlinear time series prediction,” Appl. Soft Comput., vol. 62, pp. 524–535, Jan. 2018. doi: 10.1016/j.asoc.2017.10.038
    [15]
    D. Li, M. Han, and J. Wang, “Chaotic time series prediction based on a novel robust echo state network,” IEEE Trans. Neural Netw. Learn. Syst., vol. 23, no. 5, pp. 787–799, May 2012. doi: 10.1109/TNNLS.2012.2188414
    [16]
    M. Xu and M. Han, “Adaptive elastic echo state network for multivariate time series prediction,” IEEE Trans. Cybern., vol. 46, no. 10, pp. 2173–2183, Oct. 2016. doi: 10.1109/TCYB.2015.2467167
    [17]
    X. Dutoit, B. Schrauwen, J. Van Campenhout, D. Stroobandt, H. Van Brussel, and M. Nuttin, “Pruning and regularization in reservoir computing,” Neurocomputing, vol. 72, no. 7–9, pp. 1534–1546, Mar. 2009.
    [18]
    F. Li and Y. Li, “Robust echo state network with Cauchy loss function and hybrid regularization for noisy time series prediction,” Appl. Soft Comput., vol. 146, p. 110640, Oct. 2023. doi: 10.1016/j.asoc.2023.110640
    [19]
    C. Peng and J. Ma, “Domain adaptive semantic segmentation via entropy-ranking and uncertain learning-based self-training,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 8, pp. 1524–1527, Aug. 2022. doi: 10.1109/JAS.2022.105767
    [20]
    R. Izanloo, S. A. Fakoorian, H. S. Yazdi, and D. Simon, “Kalman filtering based on the maximum correntropy criterion in the presence of non-Gaussian noise,” in Proc. Annu. Conf. Inf. Syst. Sci., Princeton, NJ, USA, 2016, pp. 500–505.
    [21]
    B. Shen, X. Wang, and L. Zou, “Maximum correntropy Kalman filtering for non-Gaussian systems with state saturations and stochastic nonlinearities,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 5, pp. 1223–1233, May 2023. doi: 10.1109/JAS.2023.123195
    [22]
    C. L. P. Chen, and Z. Liu, “Broad learning system: An effective and efficient incremental learning system without the need for deep architecture,” IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 1, pp. 10–24, Jan. , 2018. doi: 10.1109/TNNLS.2017.2716952
    [23]
    G. B. Huang, Q. Y. Zhu, and C. K. Siew, “Extreme learning machine: Theory and applications,” Neurocomputing, vol. 70, no. 1–3, pp. 489–501, Dec. 2006.
    [24]
    S. Li, Z. You, H. Guo, X. Luo, and Z.-Q. Zhao, “Inverse-free extreme learning machine with optimal information updating,” IEEE Trans. Cybern., vol. 46, no. 5, pp. 1229–1241, May 2016. doi: 10.1109/TCYB.2015.2434841
    [25]
    Y. Zheng, B. Chen, S. Wang, and W. Wang, “Broad learning system based on maximum correntropy criterion,” IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 7, pp. 3083–3097, Jul. 2021. doi: 10.1109/TNNLS.2020.3009417
    [26]
    J. Cao, H. Dai, B. Lei, C. Yin, H. Zeng, and A. Kummert, “Maximum correntropy criterion-based hierarchical one-class classification,” IEEE Trans. Neural Netw. Learn. Syst., vol. 32, no. 8, pp. 3748–3754, Aug. 2021. doi: 10.1109/TNNLS.2020.3015356
    [27]
    Y. Guo, F. Wang, B. Chen, and J. Xin, “Robust echo state networks based on correntropy induced loss function,” Neurocomputing, vol. 267, pp. 295–303, Dec. 2017. doi: 10.1016/j.neucom.2017.05.087
    [28]
    C. Zhang, Y. Guo, F. Wang, and B. Chen, “Generalized maximum correntropy-based echo state network for robust nonlinear system identification,” in Proc. Int. Jt. Conf. Neural Networks, Rio de Janeiro, Brazil, 2018, pp. 1–6.
    [29]
    W. He, M. Liu, Y. Tang, Q. Liu, and Y. Wang, “Differentiable automatic data augmentation by proximal update for medical image segmentation,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 7, pp. 1315–1318, Jul. 2022. doi: 10.1109/JAS.2022.105701
    [30]
    X. Li, Y. Xu, N. Li, B. Yang, and Y. Lei, “Remaining useful life prediction with partial sensor malfunctions using deep adversarial networks,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 1, pp. 121–134, Jan. 2023. doi: 10.1109/JAS.2022.105935
    [31]
    M. Girolami, “Mercer kernel-based clustering in feature space,” IEEE Trans. Neural Netw., vol. 13, no. 3, pp. 780–784, May 2002. doi: 10.1109/TNN.2002.1000150
    [32]
    Z. Zeng, J. Wang, and X. Liao, “Stability analysis of delayed cellular neural networks described using cloning templates,” IEEE Trans. Circuits Syst. I Regul. Pap., vol. 51, no. 11, pp. 2313–2324, Nov. 2004. doi: 10.1109/TCSI.2004.836855
    [33]
    L. Jin, S. Liang, X. Luo, and M. Zhou, “Distributed and time-delayed k-winner-take-all network for competitive coordination of multiple robots,” IEEE Trans. Cybern., vol. 53, no. 1, pp. 641–652, Jan. , 2023. doi: 10.1109/TCYB.2022.3159367
    [34]
    M. Wang, Y. Xie, and S. Qin, “An adaptive memristor-programming neurodynamic approach to nonsmooth nonconvex optimization problems,” IEEE Trans. Syst. Man Cybern. Syst., vol. 53, no. 11, pp. 6874–6885, Nov. 2023. doi: 10.1109/TSMC.2023.3287237
    [35]
    R. P. Agarwal, M. Meehan, and D. O’ Regan, Fixed Point Theory and Applications. Cambridge, U.K.: Cambridge Univ. Press, 2001.
    [36]
    X.-T. Yuan and B.-G. Hu, “Robust feature extraction via information theoretic learning,” in Proc. 26th Annu. Int. Conf. Mach. Learn., Montreal, QC, Canada, 2009, pp. 1193–1200.
    [37]
    S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge, U.K.: Cambridge Univ. Press, 2004.
    [38]
    X. Chen, M. Liu, and S. Li, “Echo state network with probabilistic regularization for time series prediction,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 8, pp. 1743–1753, Aug. 2023. doi: 10.1109/JAS.2023.123489
    [39]
    S. Park, K. M. Cohen, and O. Simeone, “Few-shot calibration of set predictors via meta-learned cross-validation-based conformal prediction,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 46, no. 1, pp. 280–291, Jan. 2024. doi: 10.1109/TPAMI.2023.3327300
    [40]
    M. Khalid and A. V. Savkin, “A method for short-term wind power prediction with multiple observation points,” IEEE Trans. Power Syst., vol. 27, no. 2, pp. 579–586, May 2012. doi: 10.1109/TPWRS.2011.2160295
    [41]
    Z. Shi and M. Han, “Support vector echo-state machine for chaotic time-series prediction,” IEEE Trans. Neural Netw., vol. 18, no. 2, pp. 359–372, Mar. 2007. doi: 10.1109/TNN.2006.885113
    [42]
    M. Xu, M. Han, C. L. P. Chen, and T. Qiu, “Recurrent broad learning systems for time series prediction,” IEEE Trans. Cybern., vol. 50, no. 4, pp. 1405–1417, Apr. 2020. doi: 10.1109/TCYB.2018.2863020
    [43]
    M. Milanese and C. Novara, “Set membership prediction of nonlinear time series,” IEEE Trans. Automat. Contr., vol. 50, no. 11, pp. 1655–1669, Nov. 2005. doi: 10.1109/TAC.2005.858693
