Peer-Reviewed

New Neural Network Corresponding to the Evolution Process of the Brain

Received: 21 January 2021    Accepted: 28 January 2021    Published: 9 February 2021
Abstract

In this paper, the logic is developed on the assumption that all parts of the brain are composed of combinations of modules that share essentially the same structure. The fundamental function is feeding behavior: searching for food while avoiding danger. This is the most essential function of animals in the early stages of evolution and the basis of time-series data processing. The module is represented by a neural network with learning capabilities based on Hebb's law and is called the basic unit. The basic units are arranged in layers, and information flows bidirectionally between the layers. This new neural network is an extension of the conventional neural network, which evolved from pattern recognition. Its most distinctive feature is that, while processing time-series data, the activated part of the network changes according to the context structure of the data. The network predicts events from the context of learned behavior and selects the best action. It is important to incorporate higher levels of intelligence such as learning and imitation, as well as long-term memory and object symbolization. A neural network that deals with the "descriptive world," which expresses past and future events, is added to the neural network that deals with the "real world" of familiar events. The scheme of the network's functions is illustrated using concepts from category theory.
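The abstract describes the basic unit as a small neural network trained by Hebb's law (a connection strengthens when its input and output are active together). The following is a minimal illustrative sketch of such a unit; the class name, activation rule, and parameters are assumptions chosen for clarity, not the author's implementation.

```python
import numpy as np

class BasicUnit:
    """A toy "basic unit": one layer updated by Hebb's law.

    Assumed for illustration: binary threshold activation and a plain
    Hebbian update (delta_w proportional to output times input).
    """

    def __init__(self, n_in, n_out, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.1, size=(n_out, n_in))
        self.lr = lr

    def forward(self, x):
        # Binary firing pattern: a unit fires if its weighted input is positive.
        return (self.w @ x > 0).astype(float)

    def hebbian_update(self, x):
        y = self.forward(x)
        # Hebb's law: strengthen weights between co-active input and output units.
        self.w += self.lr * np.outer(y, x)
        return y

unit = BasicUnit(n_in=4, n_out=3)
x = np.array([1.0, 0.0, 1.0, 0.0])
y = unit.hebbian_update(x)
```

In the paper's scheme, many such units would be stacked in layers with bidirectional connections between them; this sketch shows only the local learning rule of a single unit.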

Published in American Journal of Neural Networks and Applications (Volume 7, Issue 1)
DOI 10.11648/j.ajnna.20210701.11
Page(s) 1-6
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

Time-Series Data, Acceptance and Generation, Learning Context of Data, Prediction by Context, Extended Conventional DNN, Bidirectional Communication Between Layers, Mirror Neuron and Category Theory

Cite This Article
  • APA Style

    Seisuke Yanagawa. (2021). New Neural Network Corresponding to the Evolution Process of the Brain. American Journal of Neural Networks and Applications, 7(1), 1-6. https://doi.org/10.11648/j.ajnna.20210701.11


Author Information
  • OptID, Machida, Tokyo, Japan
