Self-Generation ART-1 Neural Network with Gradient-Descent Method Aid for Latin Alphabet Recognition
Abstract
Problem statement: In this study, a self-generation ART-1 neural network, an efficient algorithm that emulates self-organizing pattern recognition and was developed to avoid the stability-plasticity dilemma in competitive network learning, is presented for Latin alphabet recognition, intended for use in a vision system for road sign recognition. Approach: The first step of our approach deals with the training process, where a set of input vectors is presented sequentially to the preprocessor to specify the inputs for the network. Secondly, the value of the mean squared error is used to measure the candidate for the output in the recognition phase. Thirdly, to move down the large error surface created by the delta rule during the search phase, gradient descent is used by changing each weight value by an amount proportional to the negative of the slope of the sigmoid function. Results: In the simulation test, our system can self-organize in real time, producing stable recognition while receiving input patterns beyond those originally stored. It preserves its previously learned knowledge while retaining its ability to learn new patterns. Conclusions: The results suggest that the proposed system is suitable for practical use.
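The following is a minimal Python sketch of the kind of classifier the abstract outlines: binary input vectors are assigned to ART-1 categories guarded by a vigilance test, a new category is self-generated when no stored prototype matches, the recognition phase ranks candidate categories by mean squared error, and a gradient-descent style refinement scales the weight update by the slope of the sigmoid. All class and method names, the vigilance value, and the `refine` step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SelfGeneratingART1:
    """Minimal ART-1 sketch: binary inputs, a vigilance test, and
    self-generation of a new category when no stored prototype matches."""

    def __init__(self, n_inputs, vigilance=0.8, lr=0.1):
        self.n = n_inputs
        self.rho = vigilance        # vigilance parameter (assumed value)
        self.lr = lr                # step size for the gradient-style refinement
        self.bottom_up = []         # b_j: bottom-up weight vectors
        self.top_down = []          # t_j: binary top-down prototypes

    def _new_category(self, x):
        # Standard ART-1 initialisation for a freshly created category.
        self.top_down.append(x.copy())
        self.bottom_up.append(x / (0.5 + x.sum()))
        return len(self.top_down) - 1

    def train(self, x):
        x = np.asarray(x, dtype=float)
        # Rank existing categories by bottom-up activation (choice function).
        order = np.argsort([-(b @ x) for b in self.bottom_up]) if self.bottom_up else []
        for j in order:
            match = np.minimum(self.top_down[j], x)   # logical AND for binary vectors
            if match.sum() / max(x.sum(), 1e-9) >= self.rho:
                # Vigilance passed: fast-learning update of the winning category.
                self.top_down[j] = match
                self.bottom_up[j] = match / (0.5 + match.sum())
                return j
        # No category matched: self-generate a new one (plasticity without
        # overwriting previously learned prototypes).
        return self._new_category(x)

    def recognize(self, x):
        """Pick the candidate with the smallest mean squared error to the input,
        as the abstract describes for the recognition phase."""
        x = np.asarray(x, dtype=float)
        errors = [np.mean((t - x) ** 2) for t in self.top_down]
        return int(np.argmin(errors)), min(errors)

    def refine(self, x, j):
        """Gradient-descent style refinement: adjust the winner's bottom-up
        weights by an amount proportional to the sigmoid slope (an assumed
        reading of the paper's gradient-descent aid)."""
        x = np.asarray(x, dtype=float)
        net = self.bottom_up[j] @ x
        slope = sigmoid(net) * (1.0 - sigmoid(net))   # derivative of the sigmoid
        err = x - self.top_down[j]
        self.bottom_up[j] += self.lr * slope * err    # move down the error surface

# Usage with hypothetical 5x5 binarised letter bitmaps flattened to 25-bit vectors.
net = SelfGeneratingART1(n_inputs=25)
A = np.random.randint(0, 2, 25)   # stand-in for a preprocessed glyph
net.train(A)
label, mse = net.recognize(A)
```

The key property the sketch tries to reflect is the stability-plasticity trade-off: existing prototypes are only narrowed (never overwritten) when the vigilance test passes, while unfamiliar inputs spawn new categories instead of disturbing old ones.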
DOI: https://doi.org/10.3844/jcssp.2008.631.637
Copyright: © 2008 Mbaïtiga Zacharie. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Keywords
- ART-1
- binary input vector
- character recognition
- Latin alphabet