Peer-Reviewed

Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO

Received: 29 November 2016    Accepted:     Published: 1 December 2016
Abstract

The Echo State Network (ESN) is a special type of recurrent neural network that has become increasingly popular in machine learning domains such as time series forecasting, data clustering, and nonlinear system identification. The network is built around a large, randomly constructed recurrent neural network (RNN) called the “reservoir”, in which the neurons are sparsely connected and the internal weights remain fixed during training, so that only the output layer needs to be trained. However, the reservoir is criticized for its randomness and instability, which stem from the random initialization of its connectivity and weights. In this article, we introduce selective ensemble learning based on binary particle swarm optimization (BPSO) to improve the generalization performance of the ESN. Two widely studied tasks are used to demonstrate the feasibility and superiority of the BPSO-based selective ESN ensemble (SESNE-BPSO) model. The results indicate that the SESNE-BPSO model performs better than the general ESN ensemble, the single standard ESN, and several other improved ESN models.
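
To make the mechanism summarized above concrete, the sketch below shows, in Python, (i) an ESN whose sparsely connected reservoir weights are drawn at random and then frozen, so that only the linear readout is fitted (here by ridge regression), and (ii) a binary PSO routine that selects a subset of ensemble members by minimizing the validation RMSE of their averaged prediction. This is a minimal illustration under assumed hyperparameters and an assumed fitness definition, not the authors' implementation; the names ESN and bpso_select and all parameter values are placeholders.

# Minimal sketch (not the authors' code): an ESN with a fixed random reservoir
# and a BPSO-style selection of ensemble members. Hyperparameters, names and
# the RMSE fitness below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

class ESN:
    def __init__(self, n_in, n_res=200, spectral_radius=0.9, sparsity=0.05, ridge=1e-6):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))            # fixed input weights
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W[rng.random((n_res, n_res)) > sparsity] = 0.0               # sparse connectivity
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state scaling
        self.W, self.n_res, self.ridge = W, n_res, ridge

    def _states(self, U):
        x, X = np.zeros(self.n_res), np.empty((len(U), self.n_res))
        for t, u in enumerate(U):
            x = np.tanh(self.W_in @ np.atleast_1d(u) + self.W @ x)   # reservoir update
            X[t] = x
        return X

    def fit(self, U, y, washout=50):
        X, y = self._states(U)[washout:], np.asarray(y)[washout:]
        # Only the linear readout is trained, via ridge regression.
        A = X.T @ X + self.ridge * np.eye(self.n_res)
        self.W_out = np.linalg.solve(A, X.T @ y)
        return self

    def predict(self, U):
        return self._states(U) @ self.W_out

def bpso_select(preds, y_val, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Binary PSO: pick the subset of members whose averaged prediction
    minimizes validation RMSE (sigmoid rule of Kennedy & Eberhart, 1997)."""
    P = np.stack(preds)                                    # (members, time) predictions
    def fitness(mask):
        if mask.sum() == 0:
            return np.inf                                  # forbid the empty ensemble
        return np.sqrt(np.mean((P[mask.astype(bool)].mean(axis=0) - y_val) ** 2))
    X = rng.integers(0, 2, (n_particles, len(preds)))      # bit = member selected or not
    V = rng.uniform(-1.0, 1.0, X.shape)
    pbest, pbest_f = X.copy(), np.array([fitness(x) for x in X])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = (rng.random(X.shape) < 1.0 / (1.0 + np.exp(-V))).astype(int)
        f = np.array([fitness(x) for x in X])
        better = f < pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest.astype(bool)

In a SESNE-BPSO-style workflow one would train several such ESNs with different random reservoirs, pass their validation predictions to bpso_select, and average only the selected members on the test set.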

Published in Automation, Control and Intelligent Systems (Volume 4, Issue 6)
DOI 10.11648/j.acis.20160406.11
Page(s) 84-88
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2016. Published by Science Publishing Group

Keywords

Echo State Network, Reservoir Computing, Artificial Neural Network, Ensemble Learning, Selective Ensemble, Particle Swarm Optimization

References
[1] B. Schrauwen, D. Verstraeten, J. Van Campenhout, An overview of reservoir computing: theory, applications and implementations, Proceedings of the 15th European Symposium on Artificial Neural Networks (2007), pp. 471-482.
[2] M. Lukoševičius, H. Jaeger, Reservoir computing approaches to recurrent neural network training, Computer Science Review, 3 (2009) 127-149.
[3] H. Jaeger, Tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the "echo state network" approach (GMD-Forschungszentrum Informationstechnik, 2002).
[4] H. Jaeger, Reservoir riddles: Suggestions for echo state network research, Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IEEE, 2005), pp. 1460-1462.
[5] W. Maass, Liquid state machines: motivation, theory, and applications, Computability in context: computation and logic in the real world, (2010) 275-296.
[6] J. Schmidhuber, D. Wierstra, M. Gagliolo, F. Gomez, Training recurrent networks by evolino, Neural computation, 19 (2007) 757-779.
[7] H. Jaeger, Adaptive nonlinear system identification with echo state networks, Advances in Neural Information Processing Systems (2002), pp. 593-600.
[8] H. Jaeger, H. Haas, Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication, Science, 304 (2004) 78-80.
[9] S.-X. Lun, X.-S. Yao, H.-Y. Qi, H.-F. Hu, A novel model of leaky integrator echo state network for time-series prediction, Neurocomputing, 159 (2015) 58-66.
[10] C. Zhang, Y. Ma, Ensemble machine learning (Springer, 2012).
[11] D. West, S. Dellana, J. Qian, Neural network ensemble strategies for financial decision applications, Computers & operations research, 32 (2005) 2543-2559.
[12] L. K. Hansen, P. Salamon, Neural network ensembles, IEEE transactions on pattern analysis and machine intelligence, 12 (1990) 993-1001.
[13] Z.-H. Zhou, Ensemble learning, Encyclopedia of Biometrics, (2015) 411-416.
[14] L. K. Hansen, P. Salamon, Neural network ensembles, IEEE Transactions on Pattern Analysis & Machine Intelligence, (1990) 993-1001.
[15] G. Valentini, T. G. Dietterich, Bias-Variance Analysis and Ensembles of SVM, International Workshop on Multiple Classifier Systems (Springer, 2002), pp. 222-231.
[16] L. I. Kuncheva, C. J. Whitaker, Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy, Machine learning, 51 (2003) 181-207.
[17] Z.-H. Zhou, J. Wu, W. Tang, Ensembling neural networks: many could be better than all, Artificial intelligence, 137 (2002) 239-263.
[18] L. Davis, Handbook of genetic algorithms, (1991).
[19] J. Kennedy, Particle swarm optimization, Encyclopedia of machine learning, (Springer, 2011), pp. 760-766.
[20] J. Kennedy, R. C. Eberhart, A discrete binary version of the particle swarm algorithm, 1997 IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation (IEEE, 1997), pp. 4104-4108.
[21] H. Jaeger, The “echo state” approach to analysing and training recurrent neural networks-with an erratum note, Bonn, Germany: German National Research Center for Information Technology GMD Technical Report, 148 (2001) 34.
[22] R. C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, Proceedings of the Sixth International Symposium on Micro Machine and Human Science (New York, NY, 1995), pp. 39-43.
[23] A. Weigend, N. Gershenfeld, Time Series Prediction: Forecasting the Future and Understanding the Past (Proceedings of a NATO Advanced Research Workshop on Comparative Time Series Analysis, Santa Fe, New Mexico, 1994).
[24] S. Basterrech, An Empirical Study of the L2-Boost technique with Echo State Networks, arXiv preprint arXiv:1501.00503, (2015).
[25] Z. Deng, Y. Zhang, Collective behavior of a small-world recurrent neural system with scale-free distribution, IEEE Transactions on Neural Networks, 18 (2007) 1364-1375.
Cite This Article
  • APA Style

    Zhang, X., & Yan, X. (2016). Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO. Automation, Control and Intelligent Systems, 4(6), 84-88. https://doi.org/10.11648/j.acis.20160406.11


    ACS Style

    Zhang, X.; Yan, X. Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO. Autom. Control Intell. Syst. 2016, 4(6), 84-88. doi: 10.11648/j.acis.20160406.11


    AMA Style

    Zhang X, Yan X. Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO. Autom Control Intell Syst. 2016;4(6):84-88. doi: 10.11648/j.acis.20160406.11


  • @article{10.11648/j.acis.20160406.11,
      author = {Xiaodong Zhang and Xuefeng Yan},
      title = {Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO},
      journal = {Automation, Control and Intelligent Systems},
      volume = {4},
      number = {6},
      pages = {84-88},
      doi = {10.11648/j.acis.20160406.11},
      url = {https://doi.org/10.11648/j.acis.20160406.11},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.acis.20160406.11},
      abstract = {The Echo State Network (ESN) is a special type of recurrent neural network that has become increasingly popular in machine learning domains such as time series forecasting, data clustering, and nonlinear system identification. The network is built around a large, randomly constructed recurrent neural network (RNN) called the “reservoir”, in which the neurons are sparsely connected and the internal weights remain fixed during training, so that only the output layer needs to be trained. However, the reservoir is criticized for its randomness and instability, which stem from the random initialization of its connectivity and weights. In this article, we introduce selective ensemble learning based on binary particle swarm optimization (BPSO) to improve the generalization performance of the ESN. Two widely studied tasks are used to demonstrate the feasibility and superiority of the BPSO-based selective ESN ensemble (SESNE-BPSO) model. The results indicate that the SESNE-BPSO model performs better than the general ESN ensemble, the single standard ESN, and several other improved ESN models.},
     year = {2016}
    }
    


  • TY  - JOUR
    T1  - Improvement of Echo State Network Generalization by Selective Ensemble Learning Based on BPSO
    AU  - Xiaodong Zhang
    AU  - Xuefeng Yan
    Y1  - 2016/12/01
    PY  - 2016
    N1  - https://doi.org/10.11648/j.acis.20160406.11
    DO  - 10.11648/j.acis.20160406.11
    T2  - Automation, Control and Intelligent Systems
    JF  - Automation, Control and Intelligent Systems
    JO  - Automation, Control and Intelligent Systems
    SP  - 84
    EP  - 88
    PB  - Science Publishing Group
    SN  - 2328-5591
    UR  - https://doi.org/10.11648/j.acis.20160406.11
    AB  - The Echo State Network (ESN) is a special type of recurrent neural network that has become increasingly popular in machine learning domains such as time series forecasting, data clustering, and nonlinear system identification. The network is built around a large, randomly constructed recurrent neural network (RNN) called the “reservoir”, in which the neurons are sparsely connected and the internal weights remain fixed during training, so that only the output layer needs to be trained. However, the reservoir is criticized for its randomness and instability, which stem from the random initialization of its connectivity and weights. In this article, we introduce selective ensemble learning based on binary particle swarm optimization (BPSO) to improve the generalization performance of the ESN. Two widely studied tasks are used to demonstrate the feasibility and superiority of the BPSO-based selective ESN ensemble (SESNE-BPSO) model. The results indicate that the SESNE-BPSO model performs better than the general ESN ensemble, the single standard ESN, and several other improved ESN models.
    VL  - 4
    IS  - 6
    ER  - 


Author Information
  • Xiaodong Zhang, Key Laboratory of Advanced Control and Optimization for Chemical Processes of Ministry of Education, East China University of Science and Technology, Shanghai, P. R. China

  • Xuefeng Yan, Key Laboratory of Advanced Control and Optimization for Chemical Processes of Ministry of Education, East China University of Science and Technology, Shanghai, P. R. China
