Digit recognition using decimal coding and artificial neural network
DOI: https://doi.org/10.48129/kjs.v49i1.9556

Abstract
Current artificial neural network image recognition techniques use all pixels of the image as input. The objective of this work is to reduce the number of inputs by using characteristics computed from the initial image. The method presented in this research extracts the characteristics of a digit image by coding each row of the image as a decimal value, i.e., converting the binary representation of the row into a decimal number; this method is called decimal coding of rows. The decimal values computed from the initial image are arranged in a vector, and these values serve as the inputs of the artificial neural network. The proposed approach is based on a multilayer perceptron neural network for recognizing and predicting handwritten digits from 0 to 9. In this study, a dataset of 1797 samples was obtained from the digit database imported from the Scikit-learn library. The backpropagation algorithm was used on the training dataset, and feed-forward propagation was used on the testing dataset. The results show that the proposed approach achieves better performance in terms of both recognition rate and execution time.
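The sketch below illustrates the decimal-coding idea as the abstract describes it: each row of a binarized 8x8 digit image is read as an 8-bit binary number and converted to a decimal value, giving 8 features per image instead of 64 raw pixels, which are then fed to a multilayer perceptron. The binarization threshold and the MLPClassifier hyperparameters here are illustrative assumptions, not the authors' exact settings.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # 1797 samples of 8x8 images, pixel values 0..16

def decimal_code_rows(image, threshold=8):
    """Binarize an 8x8 image and encode each row as one decimal value.

    The threshold of 8 (midpoint of the 0..16 pixel range) is an assumption.
    """
    binary = (image >= threshold).astype(int)   # 8x8 matrix of 0/1
    weights = 2 ** np.arange(7, -1, -1)         # [128, 64, ..., 2, 1]
    return binary @ weights                     # 8 decimal values, one per row

# Build the feature vectors: 8 decimal values per image instead of 64 pixels.
X = np.array([decimal_code_rows(img) for img in digits.images])
y = digits.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Multilayer perceptron trained with backpropagation; prediction on the
# test set is a feed-forward pass, as in the abstract.
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print(f"test accuracy: {mlp.score(X_test, y_test):.3f}")

Note that the row encoding is invariant to nothing but the row's binary pattern, so the reduction from 64 inputs to 8 trades spatial detail within a row for a much smaller network input layer.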
Published: 02-12-2021

Issue: Vol. 49, No. 1

Section: Computer Science