Architecture of the Extended-Input Binary Neural Network and Applications
American Journal of Neural Networks and Applications
Volume 4, Issue 1, June 2018, Pages: 8-14
Received: Jun. 4, 2018; Accepted: Jun. 20, 2018; Published: Jul. 6, 2018
Wafik Aziz Wassef, Department of Computer Engineering, Saskatchewan Institute of Applied Science and Technology, Moose Jaw, Canada
The proposed architecture of a binary artificial neural network is inspired by the structure and function of the major parts of the brain. It is therefore divided into an input module that resembles the sensory (stimuli) area and an output module that resembles the motor (responses) area. These two modules are single-layer feed-forward neural networks with fixed weights that transform input patterns into a simple code and then convert this code back into output patterns. All possible input and output patterns are stored in the weights of these two modules. Each output pattern is produced by a single neuron of the output module asserted high; similarly, each input pattern drives a single neuron of the input module to binary 1. Training this neural network is confined to connecting one output neuron of the input module at binary 1, which represents a code for the input pattern, to one input neuron of the output module that produces the desired associated output pattern. Fast and accurate association between input and output pattern pairs can thus be achieved. These connections can be implemented by a crossbar switch, which acts much like the thalamus in the brain, considered to be a relay center. The role of the crossbar switch is generalized to an electric field in the gap between the input and output modules, and it is postulated that this field may be regarded as a bridge between the brain and mental states. The input module encoder is preceded by the extended-input circuit, which ensures that the inverse of the input matrix exists and at the same time makes the derivation of this inverse, of any order, a simple task. This circuit mimics the region of the brain that processes input signals before sending them to the sensory region. Applications of this neural network include logical relations, mathematical operations, use as a memory device, and pattern association.
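The encode–relay–decode flow described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the stored patterns and the crossbar associations below are hypothetical, and the fixed-weight encoder and decoder are stood in for by simple table lookups, since each pattern corresponds to exactly one asserted neuron.

```python
# Stored input and output patterns (hypothetical 3-bit / 2-bit examples).
# In the proposed network these are held in the fixed weights of the
# single-layer input (encoder) and output (decoder) modules.
INPUT_PATTERNS = [(0, 0, 1), (0, 1, 0), (1, 1, 1)]
OUTPUT_PATTERNS = [(1, 0), (0, 1), (1, 1)]

# "Crossbar switch": training only connects the single asserted
# input-module neuron (the code for the input pattern) to the
# output-module neuron that produces the desired associated pattern.
CROSSBAR = {0: 2, 1: 0, 2: 1}  # hypothetical pattern-pair associations

def associate(x):
    code = INPUT_PATTERNS.index(tuple(x))   # encoder: one neuron at binary 1
    return OUTPUT_PATTERNS[CROSSBAR[code]]  # relay through crossbar, decode

print(associate((0, 1, 0)))  # -> (1, 0)
```

Because training only rewires the crossbar, adding or changing an association never touches the encoder or decoder weights, which is what makes the association both fast and exact.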
The number of input neurons can be increased (higher dimensionality) by multiplexing those inputs and using latches and multi-input AND gates. It is concluded that, by emulating the major structures of the brain with artificial neural networks, the performance of these networks can be enhanced greatly: their speed increases, their memory capacity grows, and they can perform a wide range of applications.
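The multiplexing idea can be illustrated as follows. The sketch is an assumption about the intent, not the paper's circuit: a wide input is presented as time-multiplexed chunks, each chunk is latched and compared against the corresponding stored sub-pattern, and a multi-input AND gate asserts the code neuron only when every latched comparison holds.

```python
# One stored wide pattern, split into hypothetical 2-bit chunks that would
# arrive over successive multiplexer cycles.
STORED_CHUNKS = [(1, 0), (0, 1), (1, 1)]

def matches_wide(presented_chunks):
    latches = []
    for presented, stored in zip(presented_chunks, STORED_CHUNKS):
        latches.append(presented == stored)  # latch one chunk per cycle
    return all(latches)                      # multi-input AND gate

print(matches_wide([(1, 0), (0, 1), (1, 1)]))  # -> True
```

The trade-off is the usual one for multiplexing: the same physical input lines serve a pattern several times wider, at the cost of one latch per chunk and extra cycles per presentation.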
Keywords: Architecture, Modular, Pattern Association, Mathematical Operations
To cite this article
Wafik Aziz Wassef, Architecture of the Extended-Input Binary Neural Network and Applications, American Journal of Neural Networks and Applications. Vol. 4, No. 1, 2018, pp. 8-14. doi: 10.11648/j.ajnna.20180401.12
Copyright © 2018 Authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.